Sort by: 3 query results found (search time: 46 ms)
1.
Krisorn Jittorntrum, Mathematical Programming, 1980, 18(1): 197–214
It is known that augmented Lagrangian or multiplier methods for solving constrained optimization problems can be interpreted as techniques for maximizing an augmented dual function D_c(λ). For a constant c sufficiently large, by maximizing the augmented dual function D_c(λ) with respect to λ, it is shown that the Newton iteration for λ based on maximizing D_c(λ) can be decomposed into a Powell/Hestenes iteration followed by a Newton-like correction. Superimposed on the original Powell/Hestenes method, a simple acceleration technique is devised to make use of information from the previous iteration. For problems with only one constraint, the acceleration technique is equivalent to replacing the second (Newton-like) part of the decomposition by a finite difference approximation. Numerical results are presented.
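The Powell/Hestenes multiplier iteration that the decomposition above builds on can be sketched on a toy equality-constrained problem. The problem data, penalty constant, and inner gradient-descent solver below are illustrative assumptions, not taken from the paper:

```python
# Toy problem (illustrative): minimize f(x) = x1^2 + x2^2
# subject to g(x) = x1 + x2 - 1 = 0.  Solution: x* = (0.5, 0.5), lambda* = -1.

def f_grad(x):
    return [2 * x[0], 2 * x[1]]

def g(x):
    return x[0] + x[1] - 1.0

def minimize_augmented_lagrangian(lam, c, x, steps=300):
    """Inner step: minimize L_c(x, lam) = f(x) + lam*g(x) + (c/2)*g(x)^2
    by plain gradient descent (step size chosen from the quadratic's curvature)."""
    step = 1.0 / (2.0 + 2.0 * c)
    for _ in range(steps):
        gx = g(x)
        # d/dx_i of L_c adds (lam + c*g(x)) * dg/dx_i, and dg/dx_i = 1 here
        grad = [fg + lam + c * gx for fg in f_grad(x)]
        x = [xi - step * gi for xi, gi in zip(x, grad)]
    return x

def powell_hestenes(c=10.0, outer=25):
    lam, x = 0.0, [0.0, 0.0]
    for _ in range(outer):
        x = minimize_augmented_lagrangian(lam, c, x)
        lam = lam + c * g(x)  # Powell/Hestenes multiplier update
    return x, lam

x, lam = powell_hestenes()
```

For this quadratic the multiplier error contracts by a factor 1/(1 + c) per outer iteration, which is the first-order behaviour the paper's Newton-like correction is designed to accelerate.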
2.
An implicit function theorem   Cited by: 1 (self-citations: 0, by others: 1)
K. Jittorntrum, Journal of Optimization Theory and Applications, 1978, 25(4): 575–577
Suppose that F: D ⊆ R^n × R^m → R^n, with F(x_0, y_0) = 0. The classical implicit function theorem requires that F is differentiable with respect to x and, moreover, that ∂_1 F(x_0, y_0) is nonsingular. We strengthen this theorem by removing the nonsingularity and differentiability requirements and by replacing them with a one-to-one condition on F as a function of x.
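For reference, the classical theorem that this paper strengthens can be stated as follows (a standard textbook form, not quoted from the paper):

```latex
Let $F : D \subseteq \mathbb{R}^n \times \mathbb{R}^m \to \mathbb{R}^n$ be
continuously differentiable near $(x_0, y_0)$ with $F(x_0, y_0) = 0$.
If the partial Jacobian $\partial_1 F(x_0, y_0) \in \mathbb{R}^{n \times n}$
is nonsingular, then there exist neighbourhoods $U \ni y_0$ and $V \ni x_0$
and a unique continuous map $x : U \to V$ with $x(y_0) = x_0$ such that
\[
  F\bigl(x(y), y\bigr) = 0 \quad \text{for all } y \in U .
\]
```

The paper replaces both the differentiability and the nonsingularity hypotheses by the single requirement that $x \mapsto F(x, y)$ be one-to-one.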
3.
Summary  Strong uniqueness has proved to be an important condition in demonstrating the second order convergence of the generalised Gauss-Newton method for discrete nonlinear approximation problems [4]. Here we compare strong uniqueness with the multiplier condition which has also been used for this purpose. We describe strong uniqueness in terms of the local geometry of the unit ball and properties of the problem functions at the minimum point. When the norm is polyhedral we are able to give necessary and sufficient conditions for the second order convergence of the generalised Gauss-Newton algorithm.
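The paper concerns the generalised Gauss-Newton method for arbitrary (in particular polyhedral) norms; in the familiar least-squares case it reduces to the classical iteration, which can be sketched on a one-parameter model. The model, data, and starting point below are illustrative assumptions:

```python
import math

# Classical (least-squares) Gauss-Newton for a one-parameter model, as a
# simple illustration of the iteration the generalised method extends.
# Fit y = exp(a * t) to samples generated with a = 0.5; the residual is
# zero at the solution, so the iteration converges rapidly.

t_data = [0.0, 1.0, 2.0, 3.0]
y_data = [math.exp(0.5 * t) for t in t_data]

def gauss_newton(a, iters=20):
    for _ in range(iters):
        r = [math.exp(a * t) - y for t, y in zip(t_data, y_data)]  # residuals
        J = [t * math.exp(a * t) for t in t_data]                  # d r_i / d a
        # Gauss-Newton step: minimize ||J*da + r||_2, i.e. da = -(J.r)/(J.J)
        a -= sum(j * ri for j, ri in zip(J, r)) / sum(j * j for j in J)
    return a

a_hat = gauss_newton(0.3)
```

Replacing the Euclidean norm in the linearised subproblem by a polyhedral norm (e.g. the maximum norm) gives the generalised method studied in the abstract, where strong uniqueness governs second order convergence.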