Karmarkar's linear programming algorithm and Newton's method |
| |
Authors: | D. A. Bayer; J. C. Lagarias |
| |
Institution: | (1) Columbia University, New York, NY, USA; (2) AT&T Bell Laboratories, Murray Hill, NJ, USA |
| |
Abstract: | This paper describes a full-dimensional version of Karmarkar's linear programming algorithm, the projective scaling algorithm, which is defined for any linear program in R^n having a bounded, full-dimensional polytope of feasible solutions. If such a linear program has m inequality constraints, then it is equivalent under an injective affine mapping J: R^n → R^m to Karmarkar's original algorithm for a linear program in R^m having m − n equality constraints and m inequality constraints. Karmarkar's original algorithm minimizes a potential function g(x), and the projective scaling algorithm is equivalent to that version of Karmarkar's algorithm whose step size minimizes the potential function in the step direction. The projective scaling algorithm is shown to be a global Newton method for minimizing a logarithmic barrier function in a suitable coordinate system. The new coordinate system is obtained from the original coordinate system by a fixed projective transformation y = Φ(x), which maps the hyperplane H_opt = {x : ⟨c, x⟩ = c_0} specified by the optimal value of the objective function to the hyperplane at infinity. The feasible solution set is mapped under Φ to an unbounded polytope. Let f_LB(y) denote the logarithmic barrier function associated to the m inequality constraints in the new coordinate system; it coincides up to an additive constant with Karmarkar's potential function in the new coordinate system. The global Newton method iterate y* for a strictly convex function f(y) defined on a suitable convex domain is the point y* that minimizes f(y) on the search ray {y + λ v_N(y) : λ ≥ 0}, where v_N(y) = −(∇²f(y))⁻¹ ∇f(y) is the Newton's method vector. If {x^(k)} is a set of projective scaling algorithm iterates in the original coordinate system and y^(k) = Φ(x^(k)), then {y^(k)} is a set of global Newton method iterates for f_LB(y), and conversely. Karmarkar's algorithm with step size chosen to minimize the potential function is known to converge at least at a linear rate. It is shown (by example) that this algorithm does not have a superlinear convergence rate. |
| |
Keywords: | Karmarkar's algorithm; modified Newton's method; global Newton's method; projective transformation; logarithmic barrier function; nonlinear optimization |
This document is indexed in SpringerLink and other databases.
|