Optimal Rates for the Regularized Least-Squares Algorithm
Authors: A. Caponnetto, E. De Vito
Institutions: (1) Department of Computer Science, University of Chicago, 1100 East 58th Street, Chicago, IL 60637, USA, and D.I.S.I., Università di Genova, Via Dodecaneso 35, 16146 Genova, Italy; (2) Dipartimento di Matematica, Università di Modena, Via Campi 213/B, 41100 Modena, Italy, and I.N.F.N., Sezione di Genova, Via Dodecaneso 33, 16146 Genova, Italy
Abstract: We develop a theoretical analysis of the performance of the regularized least-squares algorithm on a reproducing kernel Hilbert space in the supervised learning setting. The presented results hold in the general framework of vector-valued functions; therefore they can be applied to multi-task problems. In particular, we observe that the concept of effective dimension plays a central role in the definition of a criterion for the choice of the regularization parameter as a function of the number of samples. Moreover, a complete minimax analysis of the problem is described, showing that the convergence rates obtained by regularized least-squares estimators are indeed optimal over a suitable class of priors defined by the considered kernel. Finally, we give an improved lower-rate result describing the worst asymptotic behavior on individual probability measures rather than over classes of priors.
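For readers unfamiliar with the estimator the abstract analyzes, the scalar-valued case of regularized least-squares (kernel ridge regression) can be sketched in a few lines: given samples (x_i, y_i), one solves (K + nλI)α = y for the kernel matrix K and predicts with f(x) = Σ_i α_i k(x_i, x). The sketch below is illustrative only; the Gaussian kernel, the bandwidth, the synthetic data, and the choice λ ∝ n^{-1/2} are assumptions for the demo, not the paper's specific prior-dependent parameter rule.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def rls_fit(X, y, lam, gamma=1.0):
    # Regularized least-squares: solve (K + n*lam*I) alpha = y.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def rls_predict(X_train, alpha, X_new, gamma=1.0):
    # f(x) = sum_i alpha_i k(x_i, x)
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Synthetic 1-D regression problem (hypothetical data for illustration).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

# lam ~ n^{-1/2}: a simple sample-size-dependent choice in the spirit of
# the abstract's criterion (the paper's actual rule depends on the prior).
lam = X.shape[0] ** -0.5
alpha = rls_fit(X, y, lam)

X_test = np.linspace(-1.0, 1.0, 50)[:, None]
pred = rls_predict(X, alpha, X_test)
mse = np.mean((pred - np.sin(3 * X_test[:, 0])) ** 2)
```

The point of the sketch is the role of λ: it trades bias against variance, and the paper's contribution is identifying how fast λ should shrink with n (via the effective dimension) to attain the minimax-optimal rate.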
This article is indexed in SpringerLink and other databases.