Similar Literature
20 similar articles found (search time: 78 ms)
1.
1 Introduction. Consider the following optimization problem: $\min f(x)=\sum_{i=1}^{m} f_i(x)$, s.t. $x\in X$ (1), where each $f_i\colon \mathbb{R}^n \to \mathbb{R}$ is a convex, nondifferentiable function and $X$ is a nonempty closed convex subset of $\mathbb{R}^n$. The main methods for solving (1)...
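For concreteness, a minimal Python sketch of the projected subgradient iteration for a problem of form (1) follows; the example objective, the set X, and the divergent-series stepsize are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def projected_subgradient(subgrads, project, x0, steps):
    """Projected subgradient sketch for problem (1): minimize
    f(x) = sum_i f_i(x) over a closed convex set X."""
    x = np.asarray(x0, dtype=float)
    for t in steps:
        g = sum(sg(x) for sg in subgrads)   # a subgradient of the sum
        x = project(x - t * g)              # step, then project back onto X
    return x

# Example: minimize |x - 1| + |x + 2| over X = [0, 5]; the minimizers
# form the interval [0, 1], and the iterates settle inside it.
sol = projected_subgradient(
    subgrads=[lambda x: np.sign(x - 1.0), lambda x: np.sign(x + 2.0)],
    project=lambda x: np.clip(x, 0.0, 5.0),
    x0=4.0,
    steps=(0.5 / (k + 1) for k in range(2000)),
)
```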

2.
We generalize the subgradient optimization method for nondifferentiable convex programming to utilize conditional subgradients. Firstly, we derive the new method and establish its convergence by generalizing convergence results for traditional subgradient optimization. Secondly, we consider a particular choice of conditional subgradients, obtained by projections, which leads to an easily implementable modification of traditional subgradient optimization schemes. To evaluate the subgradient projection method, we consider its use in three applications: uncapacitated facility location, two-person zero-sum matrix games, and multicommodity network flows. Computational experiments show that the subgradient projection method performs better than traditional subgradient optimization; in some cases the difference is considerable. These results suggest that our simple modification may improve subgradient optimization schemes significantly. This finding is important as such schemes are very popular, especially in the context of Lagrangean relaxation.
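The projection idea is easiest to see for box constraints, where projecting the subgradient onto the tangent cone of X acts componentwise. A hedged sketch (the box-constrained setting and all names are our illustration, not the paper's implementation):

```python
import numpy as np

def conditional_subgradient_box(subgrad, lo, hi, x0, steps):
    """Conditional subgradient sketch for X = [lo, hi] (a box): the
    subgradient is replaced by its projection onto the tangent cone of X
    at x, which for a box just zeroes the components that would push an
    active coordinate further out of bounds."""
    x = np.clip(np.asarray(x0, dtype=float), lo, hi)
    for t in steps:
        g = np.asarray(subgrad(x), dtype=float).copy()
        g[(x <= lo) & (g > 0)] = 0.0   # active lower bound: drop outward pull
        g[(x >= hi) & (g < 0)] = 0.0   # active upper bound: drop outward pull
        x = np.clip(x - t * g, lo, hi)
    return x
```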

3.
A generalized subgradient method is considered which uses the subgradients at previous iterations as well as the subgradient at the current point. This method is a direct generalization of the usual subgradient method. We provide two sets of convergence conditions for the generalized subgradient method. Our results provide a larger class of sequences which converge to a minimum point and more freedom of adjustment to accelerate convergence. Most of this research was performed when the first author was visiting the Decision and Information Systems Department, College of Business, Arizona State University.

4.
Xuan Hua, Li Bing. Operations Research and Management Science (运筹与管理), 2015, 24(6): 121-127.
To reduce solution complexity and computation time, an improved Lagrangian relaxation algorithm incorporating an asynchronous subgradient method is proposed for the total weighted completion time problem in multi-stage hybrid flow shops. An integer programming model is built that accounts for limited waiting times and job release times. The asynchronous subgradient method is embedded in the Lagrangian relaxation algorithm, so that approximately solving the Lagrangian relaxed problem yields a reasonable asynchronous subgradient direction; searching along this direction gradually reduces the distance to the optimum. Simulation experiments verify the effectiveness of the proposed algorithm. A comparison with the traditional subgradient-based Lagrangian relaxation algorithm shows that, in terms of overall solution quality and computational efficiency, the proposed algorithm obtains better near-optimal solutions in less computation time, especially for large-scale problems.

5.
The subgradient extragradient method for solving the variational inequality (VI) problem, introduced by Censor et al. (J. Optim. Theory Appl. 148, 318–335, 2011), replaces the second projection of the extragradient method, onto the feasible set of the VI, with a subgradient projection onto a constructible half-space. Since the method was introduced, many authors have proposed extensions and modifications with applications to various problems. In this paper, we introduce a modified subgradient extragradient method by improving the stepsize of its second step. Convergence of the proposed method is proved under standard and mild conditions, and preliminary numerical experiments illustrate the performance and advantage of this new subgradient extragradient variant.
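A minimal sketch of the original subgradient extragradient iteration with a fixed stepsize tau (the paper's modified second-step stepsize is not reproduced here):

```python
import numpy as np

def subgradient_extragradient(F, proj_C, x0, tau, iters):
    """Subgradient extragradient sketch for VI(F, C): find x* in C with
    <F(x*), x - x*> >= 0 for all x in C."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        u = x - tau * F(x)
        y = proj_C(u)                  # first projection: onto C itself
        a = u - y                      # outward normal of half-space T at y
        z = x - tau * F(y)             # extragradient step
        s = a @ (z - y)
        if s > 0:                      # project z onto T = {w : <a, w - y> <= 0}
            z = z - (s / (a @ a)) * a
        x = z
    return x
```

Because the half-space T contains C, the second step is a closed-form projection rather than a second, possibly expensive, projection onto C.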

6.
In a Hilbert space, we study the convergence of the subgradient method to a solution of a variational inequality in the presence of computational errors. Most results known in the literature establish convergence of optimization algorithms when computational errors are summable. In the present paper, the convergence of the subgradient method for solving variational inequalities is established for nonsummable computational errors. We show that the subgradient method generates a good approximate solution if the sequence of computational errors is bounded from above by a constant.

7.
Surrogate Gradient Algorithm for Lagrangian Relaxation
The subgradient method is used frequently to optimize dual functions in Lagrangian relaxation for separable integer programming problems. In the method, all subproblems must be solved optimally to obtain a subgradient direction. In this paper, the surrogate subgradient method is developed, where a proper direction can be obtained without optimally solving all the subproblems. In fact, only an approximate optimization of one subproblem is needed to get a proper surrogate subgradient direction, and the directions are smooth for problems of large size. The convergence of the algorithm is proved. Compared with methods that take effort to find better directions, this method can obtain good directions with much less effort and provides a new approach that is especially powerful for problems of very large size.
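A schematic sketch of the surrogate subgradient idea on a toy separable problem: only one subproblem is re-solved per iteration, and the coupling-constraint violation at the current (mixed-age) solution serves as the dual direction. The instance, step rule, and helper names are illustrative; the paper's stepsize condition guaranteeing a proper direction is omitted.

```python
import numpy as np

def surrogate_subgradient_lr(solve_sub, constraint, lam0, x0, steps, m):
    """Schematic surrogate subgradient loop for Lagrangian relaxation of a
    separable problem: only one subproblem is re-solved per iteration, and
    the coupling-constraint violation at the current solution is used as
    the dual search direction."""
    lam, x = lam0, list(x0)
    for k, t in enumerate(steps):
        i = k % m                          # revisit subproblems cyclically
        x[i] = solve_sub(i, lam)           # re-solve only subproblem i
        g = constraint(x)                  # surrogate subgradient
        lam = np.maximum(lam + t * g, 0.0) # projected dual ascent step
    return lam, x

# Toy instance: min sum(c*x) s.t. sum(x) >= 2, x binary, relaxing the
# coupling constraint with multiplier lam >= 0.
c = [1.0, 2.0, 3.0]
lam, x = surrogate_subgradient_lr(
    solve_sub=lambda i, lam: 1.0 if c[i] - lam < 0 else 0.0,
    constraint=lambda x: 2.0 - sum(x),
    lam0=0.0,
    x0=[0.0, 0.0, 0.0],
    steps=(1.0 / (k + 1) for k in range(200)),
    m=3,
)
```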

8.
In many instances, the exact evaluation of an objective function and its subgradients can be computationally demanding. By way of example, we cite problems that arise within the context of stochastic optimization, where the objective function is typically defined via multi-dimensional integration. In this paper, we address the solution of such optimization problems by exploring the use of successive approximation schemes within subgradient optimization methods. We refer to this new class of methods as inexact subgradient algorithms. With relatively mild conditions imposed on the approximations, we show that the inexact subgradient algorithms inherit properties associated with their traditional (i.e., exact) counterparts. Within the context of stochastic optimization, the conditions that we impose allow a relaxation of requirements traditionally imposed on steplengths in stochastic quasi-gradient methods. Additionally, we study methods in which steplengths may be defined adaptively, in a manner that reflects the improvement in the objective function approximations as the iterations proceed. We illustrate the applicability of our approach by proposing an inexact subgradient optimization method for the solution of stochastic linear programs. This work was supported by Grant Nos. NSF-DDM-89-10046 and NSF-DDM-9114352 from the National Science Foundation.

9.
Computational Optimization and Applications - A method, called an augmented subgradient method, is developed to solve unconstrained nonsmooth difference of convex (DC) optimization problems. At...

10.
Journal of Optimization Theory and Applications - We study a first-order primal-dual subgradient method to optimize risk-constrained risk-penalized optimization problems, where risk is modeled via...

11.
We propose a weighting subgradient algorithm for solving multiobjective minimization problems on a nonempty closed convex subset of a Euclidean space. This method combines a weighting technique with the classical projected subgradient method, using a divergent-series steplength rule. Under the assumption of convexity, we show that the sequence generated by this method converges to a Pareto optimal point of the problem. Some numerical results are presented.
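A hedged sketch of the weighting idea: the objectives' subgradients are combined with positive weights, then a projected step is taken. The fixed-weight version below is our simplification; the paper's weight update is not reproduced.

```python
import numpy as np

def weighting_subgradient(subgrads, weights, project, x0, steps):
    """Weighting + projected subgradient sketch for multiobjective
    minimization: combine subgradients of the objectives with positive
    weights, then take a projected step with divergent-series stepsizes."""
    x = project(np.asarray(x0, dtype=float))
    for t in steps:
        d = sum(w * sg(x) for w, sg in zip(weights, subgrads))
        x = project(x - t * d)
    return x
```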

12.
We replace orthogonal projections in the Polyak subgradient method for nonnegatively constrained minimization with entropic projections, thus obtaining an interior-point subgradient method. Inexact entropic projections are quite cheap. Global convergence of the resulting method is established.
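Under the (unnormalized) entropy, the Bregman step becomes a multiplicative update, which is what keeps the iterates interior. A minimal sketch, with the paper's Polyak-type steplengths and inexact projections omitted:

```python
import numpy as np

def entropic_subgradient(subgrad, x0, steps):
    """Entropic-projection subgradient sketch for minimization over x >= 0:
    the Bregman step under (unnormalized) entropy is multiplicative, so a
    strictly positive start stays strictly positive, i.e. interior."""
    x = np.asarray(x0, dtype=float)    # must be strictly positive
    for t in steps:
        x = x * np.exp(-t * subgrad(x))
    return x
```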

13.
Numerical Algorithms - In this paper, building on the subgradient extragradient method and the inertial method with a line-search process, we introduce two new algorithms for finding a common element of...

14.
The subgradient method is both a heavily employed and widely studied algorithm for non-differentiable optimization. Nevertheless, there are some basic properties of subgradient optimization that, while “well known” to specialists, seem to be rather poorly known in the larger optimization community. This note concerns two such properties, both applicable to subgradient optimization using the divergent series steplength rule. The first involves convergence of the iterative process, and the second deals with the construction of primal estimates when subgradient optimization is applied to maximize the Lagrangian dual of a linear program. The two topics are related in that convergence of the iterates is required to prove correctness of the primal construction scheme. Dedicated to B.T. Polyak on the occasion of his 70th birthday.
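The primal construction alluded to is typically an ergodic (stepsize-weighted) average of the Lagrangian subproblem minimizers. A hedged sketch for a box-constrained LP, where all problem data and the averaging rule are illustrative assumptions:

```python
import numpy as np

def dual_subgradient_with_primal_avg(c, A, b, x_lo, x_hi, lam0, steps):
    """Maximize the Lagrangian dual of min c'x s.t. Ax >= b, x in [x_lo, x_hi],
    with divergent-series steps, while building a primal estimate as the
    stepsize-weighted (ergodic) average of the subproblem minimizers."""
    lam = np.asarray(lam0, dtype=float)
    x_avg, t_sum = 0.0, 0.0
    for t in steps:
        r = c - A.T @ lam                    # reduced costs of L(., lam)
        x = np.where(r >= 0, x_lo, x_hi)     # box-constrained Lagrangian subproblem
        g = b - A @ x                        # subgradient of the dual at lam
        lam = np.maximum(lam + t * g, 0.0)   # projected dual ascent
        x_avg = (t_sum * x_avg + t * x) / (t_sum + t)
        t_sum += t
    return lam, x_avg
```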

15.
A cyclically controlled method of subgradient projections (CSP) for the convex feasibility problem of solving convex inequalities is presented. The features of this method make it an efficient tool for handling huge and sparse problems. A particular application to an image reconstruction problem of emission computerized tomography is mentioned. Research supported by National Institutes of Health Grant HL 28438-01.
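A minimal sketch of a CSP-style sweep: each violated inequality f_i(x) <= 0 triggers a Polyak-type subgradient projection step toward its level set (the example data are our own):

```python
import numpy as np

def cyclic_subgradient_projections(fs, subgrads, x0, sweeps):
    """CSP sketch for the convex feasibility problem f_i(x) <= 0, i = 1..m:
    sweep cyclically through the inequalities and, for each violated one,
    take a Polyak-type subgradient projection step toward its level set."""
    x = np.asarray(x0, dtype=float)
    for _ in range(sweeps):
        for f, sg in zip(fs, subgrads):
            v = f(x)
            if v > 0:                        # inequality violated at x
                g = sg(x)
                x = x - (v / (g @ g)) * g    # step onto {f <= 0} linearization
    return x

# Example: find a point satisfying x0 + x1 - 1 <= 0 and -x0 <= 0.
x = cyclic_subgradient_projections(
    fs=[lambda x: x[0] + x[1] - 1.0, lambda x: -x[0]],
    subgrads=[lambda x: np.array([1.0, 1.0]), lambda x: np.array([-1.0, 0.0])],
    x0=[2.0, 2.0],
    sweeps=50,
)
```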

16.
Pham Ngoc Anh, Ho Phi Tu. Numerical Algorithms, 2021, 86(1): 55-74.
Numerical Algorithms - In this paper, building on the inexact subgradient and projection methods presented by Santos et al. (Comput. Appl. Math. 30: 91–107, 2011), we develop subgradient...

17.
We study conditions for convergence of a generalized subgradient algorithm in which a relaxation step is taken in a direction that is a convex combination of possibly all previously generated subgradients. A simple condition for convergence is given, and conditions that guarantee a linear convergence rate are also presented. We show that choosing the steplength parameter and the convex combination of subgradients optimally, in a certain sense, is equivalent to solving a minimum-norm quadratic programming problem. It is also shown that if the direction is restricted to be a convex combination of the current subgradient and the previous direction, then an optimal choice of stepsize and direction is equivalent to the Camerini-Fratta-Maffioli modification of the subgradient method. Research supported by the Swedish Research Council for Engineering Sciences (TFR).
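For reference, a sketch of the Camerini-Fratta-Maffioli deflection the abstract mentions, with the classical choice gamma = 1.5 (stepsizes and names are illustrative):

```python
import numpy as np

def cfm_subgradient(subgrad, x0, steps, gamma=1.5):
    """Camerini-Fratta-Maffioli deflection sketch: the direction is the
    current subgradient plus a nonnegative multiple of the previous
    direction, chosen so the two do not work against each other."""
    x = np.asarray(x0, dtype=float)
    d = np.zeros_like(x)
    for t in steps:
        g = subgrad(x)
        dd = d @ d
        beta = max(0.0, -gamma * (g @ d) / dd) if dd > 0 else 0.0
        d = g + beta * d                     # deflected direction
        x = x - t * d
    return x
```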

18.
We study subgradient methods for computing the saddle points of a convex-concave function. Our motivation comes from networking applications where dual and primal-dual subgradient methods have attracted much attention in the design of decentralized network protocols. We first present a subgradient algorithm for generating approximate saddle points and provide per-iteration convergence rate estimates on the constructed solutions. We then focus on Lagrangian duality, where we consider a convex primal optimization problem and its Lagrangian dual problem, and generate approximate primal-dual optimal solutions as approximate saddle points of the Lagrangian function. We present a variation of our subgradient method under the Slater constraint qualification and provide stronger estimates on the convergence rate of the generated primal sequences. In particular, we provide bounds on the amount of feasibility violation and on the primal objective function values at the approximate solutions. Our algorithm is particularly well-suited for problems where the subgradient of the dual function cannot be evaluated easily (equivalently, the minimum of the Lagrangian function at a dual solution cannot be computed efficiently), thus impeding the use of dual subgradient methods.
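A hedged sketch of a primal-dual subgradient loop for a convex-concave L(x, lam), returning running averages as the approximate saddle point (the paper's rate estimates and Slater-based variant are not reproduced):

```python
import numpy as np

def saddle_point_subgradient(grad_x, grad_lam, proj_x, proj_lam,
                             x0, lam0, t, iters):
    """Primal-dual subgradient sketch for a convex-concave L(x, lam):
    simultaneous projected descent in x and ascent in lam; the running
    averages of the iterates serve as the approximate saddle point."""
    x, lam = np.asarray(x0, dtype=float), np.asarray(lam0, dtype=float)
    x_avg, lam_avg = x.copy(), lam.copy()
    for k in range(1, iters + 1):
        gx = grad_x(x, lam)                  # subgradient of L(., lam)
        glam = grad_lam(x, lam)              # supergradient of L(x, .)
        x = proj_x(x - t * gx)               # primal descent step
        lam = proj_lam(lam + t * glam)       # dual ascent step
        x_avg += (x - x_avg) / (k + 1)       # running (ergodic) averages
        lam_avg += (lam - lam_avg) / (k + 1)
    return x_avg, lam_avg
```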

19.
In this paper, a new algorithm for minimizing locally Lipschitz functions is developed. Descent directions in this algorithm are computed by solving a system of linear inequalities. The convergence of the algorithm is proved for quasidifferentiable semismooth functions. We present the results of numerical experiments with both regular and nonregular objective functions. We also compare the proposed algorithm with two different versions of the subgradient method using the results of numerical experiments. These results demonstrate the superiority of the proposed algorithm over the subgradient method.

20.
First, this paper deals with Lagrangean heuristics for the 0-1 bidimensional knapsack problem. A projected subgradient algorithm is performed for solving a Lagrangean dual of the problem, to improve the convergence of the classical subgradient algorithm. Second, a local search is introduced to improve the lower bound on the value of the biknapsack produced by the Lagrangean heuristics. Third, a variable-fixing phase is embedded in the process. Finally, the sequence of 0-1 one-dimensional knapsack instances obtained from the algorithm is solved using reoptimization techniques in order to reduce the total computational effort. Computational results are presented.
