     


Rates of convex approximation in non-Hilbert spaces
Authors: M. J. Donahue, C. Darken, L. Gurvits, E. Sontag
Affiliation: (1) Institute for Mathematics and Its Applications, University of Minnesota, Minneapolis, MN 55455, USA; (2) Learning Systems Department, Siemens Corporate Research, Inc., 755 College Road East, Princeton, NJ 08540, USA; (3) NEC Research Institute, 4 Independence Way, Princeton, NJ 08540, USA; (4) Department of Mathematics, Rutgers University, New Brunswick, NJ 08903, USA
Abstract: This paper deals with sparse approximations by means of convex combinations of elements from a predetermined "basis" subset S of a function space. Specifically, the focus is on the rate at which the lowest achievable error can be reduced as larger subsets of S are allowed when constructing an approximant. The new results extend those given for Hilbert spaces by Jones and Barron, including, in particular, a computationally attractive incremental approximation scheme. Bounds are derived for broad classes of Banach spaces; in particular, for L_p spaces with 1 < p < ∞, the O(n^{-1/2}) bounds of Barron and Jones are recovered when p = 2. One motivation for the questions studied here arises from the area of "artificial neural networks," where the problem can be stated in terms of the growth in the number of "neurons" (the elements of S) needed in order to achieve a desired error rate. The focus on non-Hilbert spaces is due to the desire to understand approximation in the more "robust" (resistant to exemplar noise) L_p, 1 ≤ p < 2, norms. The techniques used borrow from results regarding moduli of smoothness in functional analysis as well as from the theory of stochastic processes on function spaces.
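To make the idea of an incremental scheme concrete, the following is a minimal Python sketch of a generic greedy convex-combination approximation in a finite-dimensional setting with an l_p norm. The function name `incremental_convex_approx`, the dictionary-of-vectors input format, and the coarse line search over the mixing weight are illustrative assumptions for this sketch; it is not the specific scheme or error analysis developed in the paper.

```python
import numpy as np

def incremental_convex_approx(target, dictionary, n_steps, norm_p=2):
    """Greedy incremental convex approximation (illustrative sketch).

    At each step the current approximant f_{k-1} is mixed with a single
    dictionary element g via a convex combination
        f_k = (1 - a) * f_{k-1} + a * g,   a in [0, 1],
    so after n steps the approximant is a convex combination of at most
    n dictionary elements.
    """
    target = np.asarray(target, dtype=float)
    dictionary = [np.asarray(g, dtype=float) for g in dictionary]

    # Start from the dictionary element closest to the target.
    errors = [np.linalg.norm(target - g, ord=norm_p) for g in dictionary]
    approx = dictionary[int(np.argmin(errors))].copy()

    # Coarse line search over the mixing weight; a 1-D convex
    # minimization would also do.
    alphas = np.linspace(0.0, 1.0, 101)
    for _ in range(n_steps - 1):
        best_err = np.linalg.norm(target - approx, ord=norm_p)
        best_approx = approx
        for g in dictionary:
            for a in alphas:
                candidate = (1.0 - a) * approx + a * g
                err = np.linalg.norm(target - candidate, ord=norm_p)
                if err < best_err:
                    best_err, best_approx = err, candidate
        approx = best_approx
    return approx
```

In the Hilbert-space case (p = 2) this kind of greedy mixing is the mechanism behind the O(n^{-1/2}) rate of Jones and Barron cited above; the sketch only illustrates the mechanics of building an n-term convex combination, not the rate bounds proved in the paper.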
Keywords: AMS classification 41A25, 46B09, 68T05
This article is indexed in databases including SpringerLink.
