Minimax revisited. II
Authors: B. Levit
Institution: Dept. of Math. and Statist., Queen's University, Kingston, Canada
Abstract: The global lower bound for the minimax risk proposed in Part I [12] is applied to the pointwise estimation of functions in white Gaussian noise under squared loss. Some general ellipsoidal and cuboidal functional classes are discussed, including classes of entire functions of exponential type, Paley-Wiener classes of analytic functions, Sobolev classes and their modifications. Based on the proposed risk bounds, a numerical comparison of the minimax risks and the linear minimax risks is made. A nonasymptotic comparison of different types of functional classes is facilitated by their respective embeddings, provided the classes are properly calibrated. This discussion demonstrates that the commonly perceived notion of a close connection between the smoothness of an unknown function and the accuracy of estimation can be misleading in a nonasymptotic setting. In particular, the notion of optimal rates of convergence, which has dominated nonparametric statistics for the last three decades, may no longer be productive.
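For readers unfamiliar with the setting named in the abstract, the following is the standard formulation of pointwise estimation in white Gaussian noise under squared loss; the notation is the conventional one and is assumed here, not taken from the paper itself.

\[
dY_\varepsilon(t) = f(t)\,dt + \varepsilon\, dW(t), \qquad t \in T,
\]
where $W$ is a standard Wiener process, $\varepsilon > 0$ is the noise level, and the unknown $f$ belongs to a functional class $\mathcal{F}$ (e.g. an ellipsoidal or cuboidal class). The pointwise minimax risk at a point $t_0$ under squared loss is
\[
r_\varepsilon(\mathcal{F}, t_0) = \inf_{\hat f}\, \sup_{f \in \mathcal{F}} \mathbb{E}_f \bigl(\hat f(t_0) - f(t_0)\bigr)^2,
\]
with the infimum taken over all estimators; the linear minimax risk is the same quantity with the infimum restricted to estimators linear in the observations. The paper's comparisons concern lower bounds on $r_\varepsilon$ and its ratio to the linear minimax risk.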
Keywords:
This article is indexed by SpringerLink and other databases.