1.
It is known that augmented Lagrangian, or multiplier, methods for solving constrained optimization problems can be interpreted as techniques for maximizing an augmented dual function D_c(λ) of the multiplier vector λ. For a constant c sufficiently large, by considering the maximization of D_c(λ) with respect to λ, it is shown that the Newton iteration for λ based on maximizing D_c(λ) can be decomposed into a Powell/Hestenes iteration followed by a Newton-like correction. Superimposed on the original Powell/Hestenes method, a simple acceleration technique is devised to make use of information from the previous iteration. For problems with only one constraint, the acceleration technique is equivalent to replacing the second (Newton-like) part of the decomposition by a finite-difference approximation. Numerical results are presented.
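
For orientation, the sketch below shows the basic Powell/Hestenes multiplier iteration that the decomposition above starts from; the paper's Newton-like correction and acceleration step are not reproduced. The test problem (a single equality constraint), the penalty constant c, and the use of scipy.optimize.minimize as the inner solver are illustrative assumptions, not taken from the paper.

    # Minimal sketch of the Powell/Hestenes multiplier iteration only;
    # the paper's acceleration / Newton-like correction is not shown.
    # Problem, penalty constant and inner solver are illustrative assumptions.
    import numpy as np
    from scipy.optimize import minimize

    def f(x):            # objective: squared distance from the point (2, 1)
        return (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2

    def g(x):            # single equality constraint g(x) = 0: unit circle
        return x[0] ** 2 + x[1] ** 2 - 1.0

    def augmented_lagrangian(x, lam, c):
        return f(x) + lam * g(x) + 0.5 * c * g(x) ** 2

    lam, c, x = 0.0, 10.0, np.array([0.5, 0.5])
    for k in range(20):
        # inner minimisation of the augmented Lagrangian in x for fixed lam
        x = minimize(augmented_lagrangian, x, args=(lam, c)).x
        # Powell/Hestenes multiplier update: lam <- lam + c * g(x)
        lam += c * g(x)
        if abs(g(x)) < 1e-8:
            break
    print(x, lam)        # x approaches the constrained minimiser, lam the multiplier
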
2.
An implicit function theorem
Suppose that F : D ⊆ R^n × R^m → R^n, with F(x_0, y_0) = 0. The classical implicit function theorem requires that F be differentiable with respect to x and, moreover, that the partial derivative ∂_1 F(x_0, y_0) with respect to x be nonsingular. We strengthen this theorem by removing the nonsingularity and differentiability requirements and replacing them with a one-to-one condition on F as a function of x.
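
For background, the classical implicit function theorem referred to above can be written out as follows; this is a standard textbook formulation in the abstract's notation, added only as context, and the paper's weakened hypotheses are described in the abstract itself rather than restated here.

    % Classical implicit function theorem, stated as background only.
    \[
    \begin{aligned}
    &\text{If } F : D \subseteq \mathbb{R}^n \times \mathbb{R}^m \to \mathbb{R}^n
      \text{ is } C^1 \text{ near } (x_0, y_0),\ F(x_0, y_0) = 0, \\
    &\text{and the partial Jacobian } \partial_1 F(x_0, y_0)
      \text{ (derivative with respect to } x\text{) is nonsingular,} \\
    &\text{then there exist neighbourhoods } U \ni y_0,\; V \ni x_0
      \text{ and a unique map } x(\cdot) : U \to V \\
    &\text{with } x(y_0) = x_0 \text{ and } F(x(y), y) = 0
      \text{ for all } y \in U.
    \end{aligned}
    \]
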
3.
Strong uniqueness has proved to be an important condition in demonstrating the second-order convergence of the generalised Gauss-Newton method for discrete nonlinear approximation problems [4]. Here we compare strong uniqueness with the multiplier condition, which has also been used for this purpose. We describe strong uniqueness in terms of the local geometry of the unit ball and properties of the problem functions at the minimum point. When the norm is polyhedral, we are able to give necessary and sufficient conditions for the second-order convergence of the generalised Gauss-Newton algorithm.
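
As an illustration of the setting, the sketch below performs generalised Gauss-Newton steps for a discrete Chebyshev (l-infinity) approximation problem: since the l-infinity norm is polyhedral, each linearised subproblem min_h ||f(x) + J(x) h||_inf is a linear programme. The model, data, full (undamped) step, and use of scipy.optimize.linprog are illustrative assumptions, not the paper's algorithm or test problems.

    # Sketch of generalised Gauss-Newton for min_x ||f(x)||_inf, a polyhedral
    # norm, so each linearised subproblem is solved as a linear programme.
    # Model, data and solver choice are illustrative assumptions.
    import numpy as np
    from scipy.optimize import linprog

    t = np.linspace(0.0, 1.0, 11)
    y = np.exp(0.8 * t) + 0.01 * np.cos(20 * t)      # data to be fitted

    def residuals(x):                                # f(x): model minus data
        a, b = x
        return a * np.exp(b * t) - y

    def jacobian(x):                                 # J(x): m-by-2 Jacobian of f
        a, b = x
        e = np.exp(b * t)
        return np.column_stack((e, a * t * e))

    def gauss_newton_step(x):
        # Solve min_h ||f(x) + J(x) h||_inf as an LP in the variables (h, s):
        #   minimise s  subject to  -s <= f + J h <= s
        f, J = residuals(x), jacobian(x)
        m, n = J.shape
        c = np.r_[np.zeros(n), 1.0]
        ones = np.ones((m, 1))
        A = np.vstack((np.hstack((J, -ones)),        #  J h - s <= -f
                       np.hstack((-J, -ones))))      # -J h - s <=  f
        b = np.r_[-f, f]
        bounds = [(None, None)] * n + [(0.0, None)]
        res = linprog(c, A_ub=A, b_ub=b, bounds=bounds)
        return res.x[:n]

    x = np.array([1.0, 1.0])
    for _ in range(10):
        x = x + gauss_newton_step(x)                 # full step; no line search
    print(x, np.max(np.abs(residuals(x))))
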