Abstract convergence theorem for quasi-convex optimization problems with applications |
| |
Authors: | Carisa Kwok Wai Yu, Yaohua Hu, Siu Kai Choy |
| |
Affiliation: | 1. Department of Mathematics and Statistics, Hang Seng Management College, Shatin, Hong Kong; 2. College of Mathematics and Statistics, Shenzhen University, Shenzhen, P.R. China. |
| |
Abstract: | Quasi-convex optimization is fundamental to the modelling of many practical problems in fields such as economics, finance and industrial organization. Subgradient methods are practical iterative algorithms for solving large-scale quasi-convex optimization problems. In the present paper, focusing on quasi-convex optimization, we develop an abstract convergence theorem for a class of sequences that satisfy a general basic inequality, under suitable assumptions on the parameters. The convergence properties in both function values and distances of iterates from the optimal solution set are discussed. The abstract convergence theorem covers relevant results for many types of subgradient methods studied in the literature, for either convex or quasi-convex optimization. Furthermore, we propose a new subgradient method, in which a perturbation of the successive direction is employed at each iteration. As an application of the abstract convergence theorem, we obtain convergence results for the proposed subgradient method under the Hölder condition of order p, using the constant, diminishing or dynamic stepsize rules, respectively. A preliminary numerical study shows that the proposed method outperforms the standard, stochastic and primal-dual subgradient methods in solving the Cobb–Douglas production efficiency problem. |
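To illustrate the kind of iteration the abstract refers to, the following is a minimal sketch of a *standard* subgradient method with a normalized step and the diminishing stepsize rule alpha_k = 1/(k+1); it is not the paper's perturbed method, and the test function f(x) = |x - 2| is a hypothetical example chosen only because it is convex (hence quasi-convex) with an easy subgradient.

```python
import numpy as np

def subgradient_method(subgrad, x0, num_iters=5000):
    """Standard subgradient iteration with a diminishing stepsize.

    subgrad : callable returning a subgradient of f at x.
    x0      : starting point (array-like).
    """
    x = np.asarray(x0, dtype=float)
    for k in range(num_iters):
        g = subgrad(x)
        norm = np.linalg.norm(g)
        if norm == 0.0:                      # a minimizer has been reached
            break
        # Normalized direction, common in quasi-convex subgradient schemes,
        # with the diminishing stepsize rule alpha_k = 1 / (k + 1).
        x = x - (1.0 / (k + 1)) * g / norm
    return x

# Hypothetical example: f(x) = |x - 2|, whose subgradient is sign(x - 2).
x_star = subgradient_method(lambda x: np.sign(x - 2.0), np.array([10.0]))
```

Because the stepsizes are square-summable in effect only near the tail, the iterates overshoot and oscillate around the minimizer with ever-smaller steps, which is the typical behaviour the convergence analysis of such methods must handle.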
| |
Keywords: | Quasi-convex programming; subgradient method; basic inequality; abstract convergence theorem; Cobb–Douglas production efficiency problem |