20 similar documents found (search time: 15 ms)
1.
Maxim V. Balashov Maxim O. Golubev 《Journal of Mathematical Analysis and Applications》2012,394(2):545-551
We consider the metric projection operator from the real Hilbert space onto a strongly convex set. We prove that the restriction of this operator on the complement of some neighborhood of the strongly convex set is Lipschitz continuous with the Lipschitz constant strictly less than 1. This property characterizes the class of strongly convex sets and (to a certain degree) the Hilbert space. We apply the results obtained to the question concerning the rate of convergence for the gradient projection algorithm with differentiable convex function and strongly convex set.
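The contraction property described above can be checked numerically on a standard example of a strongly convex set, the Euclidean ball. This is an illustrative sketch only: the ball, the radius `R`, and the neighborhood width `eps` are invented for the demonstration, not taken from the paper.

```python
import numpy as np

def proj_ball(x, R=1.0):
    """Metric projection onto the closed Euclidean ball of radius R,
    a standard example of a strongly convex set."""
    n = np.linalg.norm(x)
    return x if n <= R else (R / n) * x

# Restricted to points at distance >= eps outside the ball, the projection
# is Lipschitz with constant R / (R + eps) < 1 (globally it is merely
# 1-Lipschitz).  Check the constant empirically on random point pairs.
rng = np.random.default_rng(0)
R, eps = 1.0, 0.5
worst = 0.0
for _ in range(1000):
    x, y = rng.normal(size=3), rng.normal(size=3)
    x *= (R + eps + rng.random()) / np.linalg.norm(x)  # keep both points
    y *= (R + eps + rng.random()) / np.linalg.norm(y)  # outside the neighborhood
    ratio = np.linalg.norm(proj_ball(x, R) - proj_ball(y, R)) / np.linalg.norm(x - y)
    worst = max(worst, ratio)
print(worst)  # stays below R / (R + eps) = 2/3
```

The observed worst-case ratio never exceeds R/(R + eps), which is the kind of strict contraction that drives the linear convergence rate of the gradient projection algorithm on strongly convex sets.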
2.
By choosing a particular value for the parameter in the memory gradient algorithm for unconstrained optimization, we obtain a memory-gradient Goldstein-Levitin-Polyak projection descent direction for the objective function, and thereby construct a memory-gradient Goldstein-Levitin-Polyak projection algorithm for nonlinear programming problems with convex constraints. Under exact one-dimensional line search, and without assuming boundedness of the iterate sequence, we analyze the global convergence of the algorithm and obtain some rather deep convergence results. We also present memory-gradient Goldstein-Levitin-Polyak projection algorithms combined with the FR, PR, and HS conjugate gradient methods, thereby extending the classical conjugate gradient algorithms to nonlinear programming problems with convex constraints. Numerical examples show that the new algorithm is more effective than the gradient projection method.
3.
A FAMILY OF GRADIENT PROJECTION METHODS (Total citations: 1; self-citations: 1; cited by others: 0)
Ding-Zhu Du 《Acta Mathematicae Applicatae Sinica (English Series)》1985,2(1):1-13
In this paper, we give a family of gradient projection methods with three parameters and their convergence properties. This family includes Rosen's gradient projection method, Wolfe's reduced gradient method, and Zangwill's convex simplex method as special cases.
4.
5.
Naihua Xiu Changyu Wang Lingchen Kong 《Journal of Computational Mathematics》2007,25(2):221-230
In this paper, we give some convergence results on the gradient projection method with exact stepsize rule for solving the minimization problem with convex constraints. Especially, we show that if the objective function is convex and its gradient is Lipschitz continuous, then the whole sequence of iterations produced by this method with bounded exact stepsizes converges to a solution of the concerned problem.
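The basic iteration analyzed in such results, x_{k+1} = P_Ω(x_k − α_k ∇f(x_k)), can be sketched as follows. The quadratic objective, box constraint, and constant stepsize 1/L are illustrative stand-ins; the paper's exact stepsize rule is a refinement of this scheme.

```python
import numpy as np

# Gradient projection x_{k+1} = P_Omega(x_k - a * grad f(x_k)) for the convex
# quadratic f(x) = 0.5 x^T A x - b^T x over the box Omega = [0, 1]^n.
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([-1.0, 4.0])

grad = lambda x: A @ x - b
proj = lambda x: np.clip(x, 0.0, 1.0)    # projection onto the box
L = np.linalg.eigvalsh(A).max()          # Lipschitz constant of the gradient

x = np.zeros(2)
for _ in range(500):
    x = proj(x - grad(x) / L)            # constant stepsize a = 1/L
print(x)  # converges to the constrained minimizer [0, 1]
```

At the limit point (0, 1) the KKT conditions hold: the gradient (2, −2) points into the feasible box at both active bounds, so the point is a fixed point of the projected step.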
6.
Application of parallel techniques in a dual algorithm for constrained convex programming problems (Total citations: 1; self-citations: 0; cited by others: 1)
Rosen's (1961) gradient projection method is used to solve the dual of a constrained convex programming problem. Computing the projected gradient direction involves finding the optimal solution of a minimization problem in the primal variables. We use the parallel gradient distribution (PGD) algorithm to compute an approximate solution of this minimization problem, prove that the approximation can achieve any prescribed accuracy, and show that, when the accuracy is chosen appropriately, Rosen's method remains convergent.
7.
8.
Consider the constrained optimization problem min_{x∈Ω} f(x), where f: R^n → R is a continuously differentiable function and Ω is a closed convex set. This paper studies the gradient projection method for solving this problem, adopts a new strategy for choosing the step size, and proves the global convergence of the gradient projection method under rather weak conditions.
9.
Ferreira O. P. Lemes M. Prudente L. F. 《Computational Optimization and Applications》2022,81(1):91-125
The purpose of this paper is to present an inexact version of the scaled gradient projection method on a convex set, which is inexact in two senses....
10.
Mathematical Notes - Let a weakly convex function (in the general case, nonconvex and nonsmooth) satisfy the quadratic growth condition. It is proved that the gradient projection method for...
11.
12.
Based on a modified line search scheme, this paper presents a new derivative-free projection method for solving nonlinear monotone equations with convex constraints, which can be regarded as an extension of the scaled conjugate gradient method and the projection method. Under appropriate conditions, the global convergence and linear convergence rate of the proposed method are proven. Preliminary numerical results are also reported to show that this method is promising.
13.
《Nonlinear Analysis: Theory, Methods & Applications》2004,59(3):385-405
We study a steered sequential gradient algorithm which minimizes the sum of convex functions by proceeding cyclically in the directions of the negative gradients of the functions and using steered step-sizes. This algorithm is applied to the convex feasibility problem by minimizing a proximity function which measures the sum of the Bregman distances to the members of the family of convex sets. The resulting algorithm is a new steered sequential Bregman projection method which generates sequences that converge if they are bounded, regardless of whether the convex feasibility problem is or is not consistent. For orthogonal projections and affine sets the boundedness condition is always fulfilled.
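In the special case of orthogonal projections and affine sets, where the boundedness condition always holds, cyclic projection schemes reduce to classical alternating projections. A small sketch with two invented lines in R² meeting at (1, 1); the sets, starting point, and iteration count are purely illustrative.

```python
import numpy as np

# Cyclic orthogonal projections onto two affine sets in R^2:
# the lines x + y = 2 and 2x - y = 1, which intersect at (1, 1).
# With squared Euclidean distance as the Bregman distance, each step
# is an ordinary orthogonal projection onto a line.
def proj_line(x, a, c):
    """Project x onto the affine set {y : a . y = c}."""
    return x - ((a @ x - c) / (a @ a)) * a

a1, c1 = np.array([1.0, 1.0]), 2.0
a2, c2 = np.array([2.0, -1.0]), 1.0

x = np.array([4.0, -1.0])
for _ in range(100):
    x = proj_line(x, a1, c1)   # cycle through the family of sets
    x = proj_line(x, a2, c2)
print(x)  # converges to the intersection point [1, 1]
```

For two intersecting lines the convergence factor per cycle is cos²θ, where θ is the angle between them, so the iterates approach the feasible point geometrically.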
14.
Yu. A. Chernyaev 《Computational Mathematics and Mathematical Physics》2016,56(10):1716-1731
The gradient projection method and Newton’s method are generalized to the case of nonconvex constraint sets representing the set-theoretic intersection of a spherical surface with a convex closed set. Necessary extremum conditions are examined, and the convergence of the methods is analyzed.
15.
Based on a modified quasi-Newton equation and the Goldstein-Levitin-Polyak (GLP) projection technique, we establish a nonmonotone variable-metric gradient projection algorithm with a two-stage step size for solving optimization problems with convex-set constraints, and prove its global convergence and, under certain conditions, its Q-superlinear convergence rate. Numerical results show that the new algorithm is effective and suitable for large-scale problems.
16.
17.
In this paper, we propose a new three-term memory gradient hybrid projection algorithm with perturbations. The method uses a generalized Armijo line search, and its global convergence is proved under the sole condition that the gradient function is uniformly continuous on an open convex set containing the iterate sequence. Finally, several numerical examples are given.
18.
This paper is concerned with the asymptotic analysis of the trajectories of some dynamical systems built upon the gradient projection method in Hilbert spaces. For a convex function with locally Lipschitz gradient, it is proved that the orbits converge weakly to a constrained minimizer whenever it exists. This result remains valid even if the initial condition is chosen out of the feasible set and it can be extended in some sense to quasiconvex functions. An asymptotic control result, involving a Tykhonov-like regularization, shows that the orbits can be forced to converge strongly toward a well-specified minimizer. In the finite-dimensional framework, we study the differential inclusion obtained by replacing the classical gradient by the subdifferential of a continuous convex function. We prove the existence of a solution whose asymptotic properties are the same as in the smooth case.
19.
Sergei P. Gilyazov 《Numerical Functional Analysis & Optimization》2013,34(3-4):309-327
Ill-posed minimization problems in Hilbert space with a quadratic objective function and a closed convex constraint set are considered. For a compact constraint set, regularization methods for such problems are well understood [1, 2]. The regularizing properties of some iterative projection methods for a noncompact constraint set are the main subject of this paper. In particular, we examine the gradient projection method for the sphere.
20.
I. V. Konnov 《Optimization》2018,67(5):665-682
We suggest simple, implementable modifications of conditional gradient and gradient projection methods for smooth convex optimization problems in Hilbert spaces. The customary methods usually attain only weak convergence. We prove strong convergence of the new versions and establish complexity estimates, which appear similar to the convergence rates of the weakly convergent versions. Preliminary results of computational tests confirm the efficiency of the proposed modifications.
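For contrast between the two families of methods mentioned above, here is a bare-bones conditional gradient (Frank-Wolfe) iteration on an invented finite-dimensional problem: a quadratic over the unit simplex with the standard open-loop stepsize 2/(k+2). The linear subproblem is solved at a simplex vertex, so no projection is ever computed; none of this is the paper's modified method.

```python
import numpy as np

# Conditional gradient (Frank-Wolfe) for f(x) = 0.5 * ||x - t||^2 over the
# unit simplex.  Since t lies in the simplex, t itself is the minimizer.
t = np.array([0.1, 0.2, 0.7])           # illustrative target in the simplex
grad = lambda x: x - t

x = np.array([1.0, 0.0, 0.0])           # start at a vertex
for k in range(2000):
    s = np.zeros_like(x)
    s[np.argmin(grad(x))] = 1.0         # vertex minimizing the linearized model
    x += (2.0 / (k + 2)) * (s - x)      # open-loop stepsize, O(1/k) rate
print(x)  # close to t = [0.1, 0.2, 0.7]
```

Because each iterate is a convex combination of simplex vertices, feasibility is maintained automatically, which is exactly what makes conditional gradient attractive when projections onto the feasible set are expensive.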