A proximal gradient method for nonsmooth convex optimization problems
Cite this article: LI Hongwu, XIE Min, ZHANG Rong. A proximal gradient method for nonsmooth convex optimization problems[J]. OR Transactions, 2021, 25(1): 61-72.
Authors: LI Hongwu  XIE Min  ZHANG Rong
Affiliations: 1. College of Applied Sciences, Beijing University of Technology, Beijing 100124, China; 2. School of Mathematics and Statistics, Nanyang Normal University, Nanyang 473061, Henan, China; 3. Hanergy Thin Film Power Group Headquarters, Beijing 100101, China
Funding: National Natural Science Foundation of China (No. 11771003).
Abstract: We study a proximal gradient method based on line search (L-PGM), together with its convergence analysis, for solving convex optimization problems whose objective is the sum of a smooth loss function and a nonsmooth regularizer. Under the assumption that the gradient of the loss function is locally Lipschitz continuous, the method is shown to converge R-linearly. For the case in which the nonsmooth part is the sparse group Lasso regularizer, we prove that the error bound condition holds and thereby obtain a linear convergence rate. Finally, numerical experiments verify the effectiveness of the method.

Keywords: nonsmooth convex optimization  proximal gradient method  locally Lipschitz continuous  error bound  linear convergence
Received: 2019-04-01
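
In symbols, the problem class and iteration described in the abstract can be sketched as follows; this is the standard proximal gradient formulation, and the notation ($f$, $g$, $\alpha_k$) is ours rather than quoted from the paper:

$$\min_{x\in\mathbb{R}^n}\; F(x) = f(x) + g(x),$$

where $f$ is the smooth loss with locally Lipschitz continuous gradient and $g$ is the convex, possibly nonsmooth regularizer (e.g. the sparse group Lasso). With a step size $\alpha_k>0$ produced by the line search, one iteration reads

$$x^{k+1} = \operatorname{prox}_{\alpha_k g}\bigl(x^k - \alpha_k \nabla f(x^k)\bigr), \qquad \operatorname{prox}_{\alpha g}(y) = \arg\min_{x}\Bigl\{\, g(x) + \tfrac{1}{2\alpha}\|x-y\|^2 \Bigr\}.$$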

A proximal gradient method for nonsmooth convex optimization problems
LI Hongwu, XIE Min, ZHANG Rong. A proximal gradient method for nonsmooth convex optimization problems[J]. OR Transactions, 2021, 25(1): 61-72.
Authors: LI Hongwu  XIE Min  ZHANG Rong
Institution: 1. College of Applied Sciences, Beijing University of Technology, Beijing 100124, China; 2. School of Mathematics and Statistics, Nanyang Normal University, Nanyang 473061, Henan, China; 3. Hanergy Thin Film Power Group Headquarters, Beijing 100101, China
Abstract: A proximal gradient method based on line search (L-PGM) and its convergence for solving convex optimization problems whose objective function is the sum of a smooth loss function and a nonsmooth regular function are studied in this paper. Assuming that the gradient of the loss function is locally Lipschitz continuous, the R-linear convergence rate of the L-PGM method is proved. Then, focusing on problems regularized by the sparse group Lasso function, we prove that an error bound holds around the optimal solution set; thus, linear convergence of the L-PGM method for such problems is obtained. Finally, the preliminary experimental results support our theoretical analysis.
Keywords: nonsmooth convex optimization  proximal gradient method  locally Lipschitz continuous  error bound  linear convergence
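
As a concrete illustration of the kind of method the abstract describes, the sketch below implements a generic proximal gradient iteration with a backtracking line search, together with the proximal operator of the sparse group Lasso regularizer (elementwise soft-thresholding followed by groupwise shrinkage, valid for non-overlapping groups). This is a minimal sketch under standard assumptions, not the paper's exact L-PGM; the function names, the parameters alpha0 and beta, and the stopping test are illustrative choices.

import numpy as np

def prox_sparse_group_lasso(v, groups, lam1, lam2, alpha):
    # Proximal operator of g(x) = lam1*||x||_1 + lam2*sum_g ||x_g||_2 at the
    # point v with step size alpha: soft-threshold, then shrink each group.
    x = np.sign(v) * np.maximum(np.abs(v) - alpha * lam1, 0.0)
    for g in groups:  # groups: list of disjoint index arrays
        norm_g = np.linalg.norm(x[g])
        if norm_g <= alpha * lam2:
            x[g] = 0.0
        else:
            x[g] *= 1.0 - alpha * lam2 / norm_g
    return x

def proximal_gradient_linesearch(f, grad_f, prox_g, x0, alpha0=1.0, beta=0.5,
                                 max_iter=500, tol=1e-8):
    # Proximal gradient iteration; the step size alpha is reduced by the factor
    # beta until the quadratic upper-bound (sufficient decrease) test holds.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        grad = grad_f(x)
        alpha = alpha0
        while True:
            x_new = prox_g(x - alpha * grad, alpha)
            d = x_new - x
            if f(x_new) <= f(x) + grad @ d + d @ d / (2.0 * alpha):
                break
            alpha *= beta
        if np.linalg.norm(d) <= tol:
            return x_new
        x = x_new
    return x

For example, with a least-squares loss f(x) = ||Ax - b||^2 / 2 one would pass f, grad_f = lambda x: A.T @ (A @ x - b), and prox_g = lambda v, a: prox_sparse_group_lasso(v, groups, lam1, lam2, a).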