Found 20 similar documents; search took 15 ms
1.
Alexander J. Zaslavski 《Numerical Functional Analysis & Optimization》2013,34(12):1399-1412
In this article, we study convergence of the extragradient method for constrained convex minimization problems in a Hilbert space. Our goal is to obtain an ε-approximate solution of the problem in the presence of computational errors, where ε is a given positive number. Most results known in the literature establish convergence of optimization algorithms when computational errors are summable. In this article, convergence of the extragradient method for solving convex minimization problems is established for nonsummable computational errors. We show that the extragradient method generates a good approximate solution if the sequence of computational errors is bounded from above by a constant.
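The basic two-projection extragradient step that this abstract builds on can be sketched as follows. This is a minimal illustration in the smooth, error-free Euclidean case (the paper itself works in a Hilbert space with computational errors); the function names `grad_f` and `proj_C` are placeholders, not the paper's notation.

```python
import numpy as np

def extragradient(grad_f, proj_C, x0, step=0.1, iters=500):
    """Sketch of the extragradient step for min f(x) over a convex set C.

    grad_f : gradient of the (smooth) objective
    proj_C : Euclidean projection onto the feasible set C
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = proj_C(x - step * grad_f(x))   # prediction step
        x = proj_C(x - step * grad_f(y))   # correction step uses the gradient at y
    return x

# Toy example: minimize f(x) = ||x - c||^2 over the unit ball;
# the solution is the projection of c onto the ball, here (1, 0).
c = np.array([2.0, 0.0])
proj_ball = lambda z: z / max(1.0, np.linalg.norm(z))
x_star = extragradient(lambda z: 2.0 * (z - c), proj_ball, np.zeros(2))
```

The extra ("prediction") evaluation at y is what distinguishes this from plain projected gradient and is what the cited convergence analysis with nonsummable errors is about.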
2.
In this article, we propose a strongly convergent variant of the projected subgradient method for constrained convex minimization problems in Hilbert spaces. The advantage of the proposed method is that it converges strongly when the problem has solutions, without additional assumptions. The method also has the following desirable property: the sequence converges to the solution of the problem which lies closest to the initial iterate.
3.
The global optimization problem is considered under the assumption that the objective function is convex with respect to some of the variables. A finite subgradient algorithm for finding an ε-optimal solution is proposed. Results of numerical experiments are presented.
4.
5.
I. V. Konnov 《Optimization》2018,67(5):665-682
We suggest simple, implementable modifications of conditional gradient and gradient projection methods for smooth convex optimization problems in Hilbert spaces. Usually, the standard methods attain only weak convergence. We prove strong convergence of the new versions and establish complexity estimates for them, which appear similar to the convergence rates of the weakly convergent versions. Preliminary results of computational tests confirm the efficiency of the proposed modifications.
6.
We consider the method for constrained convex optimization in a Hilbert space consisting of a step in the direction opposite to an ε_k-subgradient of the objective at the current iterate, followed by an orthogonal projection onto the feasible set. The normalized stepsizes α_k are exogenously given, satisfying ∑_{k=0}^∞ α_k = ∞ and ∑_{k=0}^∞ α_k² < ∞, and ε_k is chosen so that ε_k ≤ μα_k for some μ > 0. We prove that the sequence generated in this way is weakly convergent to a minimizer if the problem has solutions, and is unbounded otherwise. Among the features of our convergence analysis, we mention that it covers the nonsmooth case, in the sense that we make no assumption of differentiability of f, much less of Lipschitz continuity of its gradient. Also, we prove weak convergence of the whole sequence, rather than just boundedness of the sequence and optimality of its weak accumulation points, thus improving over all previously known convergence results. We also present convergence rate results. © 1998 The Mathematical Programming Society, Inc. Published by Elsevier Science B.V. Research of this author was partially supported by CNPq grant nos. 301280/86 and 300734/95-6.
7.
H 《Mathematical Methods of Operations Research》1991,35(5):401-424
We give a method for solving the inclusion 0 ∈ F(x), where F is a set-valued map from a Hilbert space into a Banach space whose graph is a closed convex set. The given algorithms are convergent if the problem is consistent, and terminate after a finite number of iterations otherwise. A convergence rate of geometric progression is also obtained if a regularity condition of Robinson's type is assumed. The authors are grateful to the referees for valuable comments and suggestions. Special thanks are given to Prof. S.M. Robinson for advice.
8.
K. C. Kiwiel 《Journal of Optimization Theory and Applications》1998,96(1):159-173
We replace orthogonal projections in the Polyak subgradient method for nonnegatively constrained minimization with entropic projections, thus obtaining an interior-point subgradient method. Inexact entropic projections are quite cheap. Global convergence of the resulting method is established.
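A common concrete instance of the idea of replacing orthogonal projections by entropic ones on the nonnegative orthant is the multiplicative (exponentiated-gradient) update of mirror descent with the entropy kernel. The sketch below is this illustrative stand-in, not Kiwiel's exact method; the names and the constant-stepsize example are assumptions.

```python
import numpy as np

def entropic_subgradient(subgrad, x0, steps):
    """Mirror-descent sketch with the entropy kernel on x > 0.

    The multiplicative update x_{k+1} = x_k * exp(-alpha_k * g_k) keeps
    every iterate strictly positive, playing the role of an (inexact)
    entropic projection onto the nonnegative orthant."""
    x = np.asarray(x0, dtype=float)
    for alpha in steps:
        g = subgrad(x)
        x = x * np.exp(-alpha * g)   # stays interior automatically
    return x

# Toy example: minimize f(x) = (x - 2)^2 over x >= 0, starting at x = 1.
x = entropic_subgradient(lambda z: 2.0 * (z - 2.0), np.array([1.0]), [0.05] * 200)
```

The point of the abstract is that such interior updates are cheap compared to exact projections, while global convergence can still be proved.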
9.
In this paper, we take advantage of the availability of higher-order derivatives through the table method (see Ref. 1) and suggest a simple variant of the Lagrangian method for constrained optimization. Our method, and the software we currently have, can be used to minimize functions with many variables subject to an arbitrary number of constraints. On leave from the Faculty of Management, Tel Aviv University, Tel Aviv, Israel.
10.
In this paper, we propose an easily implementable algorithm in Hilbert spaces for solving a classical monotone variational inequality problem over the set of solutions of mixed variational inequalities. The proposed method combines two strategies: projected subgradient techniques and viscosity-type approximations. The involved stepsizes are controlled, and a strong convergence theorem is established under very classical assumptions. Our algorithm can be applied, for instance, to some mathematical programs with complementarity constraints.
11.
A filled function method for constrained global optimization
In this paper, a filled function method for solving constrained global optimization problems is proposed. A filled function is constructed to escape the current local minimizer of a constrained global optimization problem, combining the idea of filled functions from unconstrained global optimization with the idea of penalty functions from constrained optimization. A filled function method for obtaining a global minimizer, or an approximate global minimizer, of the constrained problem is then presented. Numerical results demonstrate the efficiency of this method for solving constrained global optimization problems.
12.
13.
We study a special dual form of a convex minimization problem in a Hilbert space, which is formally suggested by Fenchel duality and is useful for the Dykstra algorithm. For this special dual problem, we prove that strong duality holds if and only if the collection of underlying constraint sets {C_1, ..., C_m} has the strong conical hull intersection property. That is,

(C_1 ∩ ⋯ ∩ C_m − x)° = (C_1 − x)° + ⋯ + (C_m − x)° for every x ∈ C_1 ∩ ⋯ ∩ C_m,

where D° denotes the dual cone of D. In general, we can establish weak duality for a convex minimization problem in a Hilbert space by perturbing the constraint sets so that the perturbed sets have the strong conical hull intersection property. This generalizes a result of Gaffke and Mathar.
14.
This paper is concerned with an algorithmic solution to the split common fixed point problem in Hilbert spaces. Our method can be regarded as a variant of the “viscosity approximation method”. Under very classical assumptions, we establish a strong convergence theorem with regard to involved operators belonging to the wide class of quasi-nonexpansive operators. In contrast with other related processes, our algorithm does not require any estimate of some spectral radius. The technique of analysis developed in this work is new and can be applied to many other fixed point iterations. Numerical experiments are also performed with regard to an inverse heat problem.
15.
Lixin Cheng, Marián Fabian 《Proceedings of the American Mathematical Society》2001,129(12):3539-3541
This paper shows that the product of a Gâteaux differentiability space and a separable Banach space is again a Gâteaux differentiability space.
16.
Polyak's subgradient algorithm for nondifferentiable optimization problems requires prior knowledge of the optimal value of the objective function to find an optimal solution. In this paper, we extend the convergence properties of Polyak's subgradient algorithm with a fixed target value to the more general case of variable target values. A target value updating scheme is then provided which finds an optimal solution without prior knowledge of the optimal objective value. A convergence proof for the scheme is provided, and computational results are reported. Most of this research was performed when the first author was visiting the Decision and Information Systems Department, College of Business, Arizona State University.
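The fixed-target Polyak step that this paper generalizes can be sketched as follows. The sketch assumes the target value is known exactly (the paper's contribution is precisely removing that assumption via updated variable targets); the names are mine.

```python
import numpy as np

def polyak_subgradient(f, subgrad, f_target, x0, iters=200):
    """Polyak's step with a known target value f_target:
        x_{k+1} = x_k - ((f(x_k) - f_target) / ||g_k||^2) * g_k,
    where g_k is a subgradient of f at x_k."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = subgrad(x)
        gap = f(x) - f_target
        if gap <= 0 or not np.any(g):
            break                      # already at (or below) the target
        x = x - (gap / np.dot(g, g)) * g
    return x

# Toy example: f(x) = ||x||_1 with known optimal value 0 at the origin;
# np.sign supplies a subgradient.
x = polyak_subgradient(lambda z: np.abs(z).sum(), np.sign, 0.0,
                       np.array([3.0, -2.0]))
```

With the exact optimal value as the target, this toy run reaches the origin; the variable-target scheme of the paper replaces `f_target` with a sequence of estimates updated along the way.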
17.
We consider a dynamical system approach to solve finite-dimensional smooth optimization problems with a compact and connected feasible set. In fact, by the well-known technique of equalizing inequality constraints using quadratic slack variables, we transform a general optimization problem into an associated problem without inequality constraints in a higher-dimensional space. We compute the projected gradient for the latter problem and consider its projection on the feasible set in the original, lower-dimensional space. In this way, we obtain an ordinary differential equation in the original variables, which is specially adapted to treat inequality constraints (for the idea, see Jongen and Stein, Frontiers in Global Optimization, pp. 223–236, Kluwer Academic, Dordrecht, 2003).
The article shows that the derived ordinary differential equation possesses the basic properties which make it appropriate for solving the underlying optimization problem: the longtime behavior of its trajectories becomes stationary, all singularities are critical points, and the stable singularities are exactly the local minima. Finally, we sketch two numerical methods based on our approach.
18.
A locally convex space is said to be a Gâteaux differentiability space (GDS) provided every continuous convex function defined on a nonempty convex open subset D of the space is densely Gâteaux differentiable in D. This paper shows that the product of a GDS and a family of separable Fréchet spaces is a GDS, and that the product of a GDS and an arbitrary locally convex space endowed with the weak topology is a GDS.
19.
This paper uses a trust-region strategy combined with a projected gradient algorithm to solve constrained optimization problems; the algorithm and its convergence are given. Furthermore, the limit points are shown to satisfy the first- and second-order necessary conditions of the constrained problem.
20.
《Optimization》2012,61(6):885-902
A general iterative scheme including relaxation, and a corresponding problem class, are presented. Some global convergence results are given. The acceleration of convergence is discussed. The scheme encompasses many known iterative methods, such as subgradient methods and methods of successive orthogonal projections with relaxation. Applications to convex optimization, convex feasibility problems, systems of convex inequalities, variational inequalities, operator equations, and systems of linear equations are given.