     

A CLASS OF REDUCED GRADIENT METHODS FOR HANDLING OPTIMIZATION PROBLEMS WITH LINEAR INEQUALITY CONSTRAINTS
Cite this article: Xu Chengxian, Wei Bin. A CLASS OF REDUCED GRADIENT METHODS FOR HANDLING OPTIMIZATION PROBLEMS WITH LINEAR INEQUALITY CONSTRAINTS[J]. Applied Mathematics A Journal of Chinese Universities, 1992, 0(3)
Authors: Xu Chengxian, Wei Bin
Affiliation: Department of Mathematics, Xi'an Jiaotong University, Xi'an 710049
Abstract: A class of reduced gradient methods for handling general optimization problems with linear equality and inequality constraints is suggested in this paper. Although a slack vector is introduced, the dimension of the problem is not increased, unlike the conventional approach of transforming the inequality constraints into equality constraints by introducing slack variables. When an iterate x^(k) is not a K-T (Kuhn-Tucker) point of the problem under consideration, different feasible descent directions can be obtained by different choices of the slack vector. The suggested method is globally convergent, and the numerical experiment reported in the paper shows that it is efficient.
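To make the setting concrete, the following Python sketch shows a generic feasible-descent iteration for a problem of the form minimize f(x) subject to Ax <= b (inequality constraints only, for brevity). It is not the authors' algorithm: the projection onto the active constraints and the Armijo backtracking line search are textbook choices, and all function and variable names are illustrative assumptions; the sketch only illustrates the kind of feasible descent directions the abstract refers to.

```python
# Minimal, illustrative sketch of a feasible-descent iteration for
#   minimize f(x)  subject to  A x <= b.
# NOT the paper's reduced gradient method; a generic projected-gradient
# step is used only to show the overall structure.

import numpy as np

def feasible_descent(f, grad, A, b, x0, tol=1e-8, max_iter=200):
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        # Active set: constraints holding (nearly) with equality at x.
        active = A @ x >= b - 1e-10
        Aa = A[active]
        if Aa.size:
            # Project -g onto the null space of the active constraints,
            # so the direction keeps A_a x = b_a to first order.
            P = np.eye(len(x)) - Aa.T @ np.linalg.pinv(Aa.T)
            d = -P @ g
        else:
            d = -g
        if np.linalg.norm(d) < tol:
            return x  # stationary for this sketch (no multiplier-sign check)
        # Largest feasible step: keep A(x + t d) <= b along inactive rows.
        t_max = 1.0
        incr = A @ d > 1e-12
        if incr.any():
            t_max = min(1.0, np.min((b[incr] - A[incr] @ x) / (A[incr] @ d)))
        # Armijo backtracking line search within the feasible segment.
        t = t_max
        while t > 1e-14 and f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x = x + t * d
    return x
```

A full reduced gradient implementation would, in addition, decide when to drop active constraints (e.g. via multiplier estimates); the slack-vector construction described in the abstract is the paper's way of generating such feasible descent directions without enlarging the problem dimension.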


Keywords: Nonlinear Programming; Reduced Gradient Method; Global Convergence
This article is indexed in CNKI and other databases.
