New computational guarantees for solving convex optimization problems with first order methods,via a function growth condition measure
Authors: Robert M. Freund, Haihao Lu
Institution: 1. MIT Sloan School of Management, Cambridge, USA; 2. MIT Department of Mathematics and Operations Research Center, Cambridge, USA
Abstract: Motivated by recent work of Renegar (A framework for applying subgradient methods to conic optimization problems, arXiv:1503.02611v2, 2015), we present new computational methods and associated computational guarantees for solving convex optimization problems using first-order methods. Our problem of interest is the general convex optimization problem \(f^* = \min _{x \in Q} f(x)\), where we presume knowledge of a strict lower bound \(f_{\mathrm{slb}}< f^*\). Indeed, \(f_{\mathrm{slb}}\) is naturally known when optimizing many loss functions in statistics and machine learning (least-squares, logistic loss, exponential loss, total variation loss, etc.) as well as in Renegar’s transformed version of the standard conic optimization problem (arXiv:1503.02611v2, 2015); in all these cases one has \(f_{\mathrm{slb}}= 0 < f^*\). We introduce a new functional measure called the growth constant G for \(f(\cdot )\), which measures how quickly the level sets of \(f(\cdot )\) grow relative to the function value, and which plays a fundamental role in the complexity analysis. When \(f(\cdot )\) is non-smooth, we present new computational guarantees for the Subgradient Descent Method and for smoothing methods, which can improve existing computational guarantees in several ways, most notably when the initial iterate \(x^0\) is far from the optimal solution set. When \(f(\cdot )\) is smooth, we present a scheme for periodically restarting the Accelerated Gradient Method that can also improve existing computational guarantees when \(x^0\) is far from the optimal solution set, and in the presence of added structure we present a scheme using parametrically increased smoothing that further improves the associated computational guarantees.
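The abstract's setting of a known strict lower bound \(f_{\mathrm{slb}} < f^*\) can be illustrated with a Polyak-type subgradient step that uses \(f_{\mathrm{slb}}\) in place of the (unknown) optimal value. The sketch below is a generic illustration of this idea, not the authors' specific method or guarantees; the problem data and all names are invented. It minimizes the nonsmooth least-squares-root loss \(f(x) = \Vert Ax - b\Vert_2\), for which \(f_{\mathrm{slb}} = 0\) is the natural lower bound (and here, with a consistent system, \(f^* = 0\)).

```python
import numpy as np

# Invented test problem: consistent linear system, so f* = 0 = f_slb.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = rng.standard_normal(10)
b = A @ x_true

def f(x):
    # Nonsmooth objective: Euclidean norm of the residual.
    return np.linalg.norm(A @ x - b)

def subgrad(x):
    # A subgradient of ||Ax - b||_2; zero vector at the optimum.
    r = A @ x - b
    nr = np.linalg.norm(r)
    return A.T @ r / nr if nr > 0 else np.zeros_like(x)

f_slb = 0.0          # known strict lower bound on f*
x = np.zeros(10)
for _ in range(1000):
    g = subgrad(x)
    gn2 = np.linalg.norm(g) ** 2
    if gn2 == 0:
        break        # reached the optimal set
    # Polyak-type step length built from the lower bound f_slb.
    t = (f(x) - f_slb) / gn2
    x = x - t * g

print(f(x))  # objective driven down toward f_slb
```

When \(f_{\mathrm{slb}}\) is strictly below \(f^*\), the same step rule still makes progress, but the iterates can only be guaranteed to reach objective values within roughly \(f^* - f_{\mathrm{slb}}\) of optimal; sharper statements, via the growth constant G, are the subject of the paper.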
This article is indexed in SpringerLink and other databases.