A new class of metric divergences on probability spaces and its applicability in statistics
Authors: Ferdinand Österreicher, Igor Vajda
Institution: (1) Institute of Mathematics, University of Salzburg, 5020 Salzburg, Austria; (2) Institute of Information Theory and Automation, Academy of Sciences, 18208 Prague, Czech Republic
Abstract: The class \(I_{f_\beta}\), \(\beta \in (0, \infty]\), of f-divergences investigated in this paper is defined in terms of a class of entropies introduced by Arimoto (1971, Information and Control, 19, 181–194). It contains the squared Hellinger distance (for \(\beta = 1/2\)), the sum \(I(Q_1 \| (Q_1+Q_2)/2) + I(Q_2 \| (Q_1+Q_2)/2)\) of Kullback–Leibler divergences (for \(\beta = 1\)), and half of the variation distance (for \(\beta = \infty\)), and it continuously extends the class of squared perimeter-type distances introduced by Österreicher (1996, Kybernetika, 32, 389–393) (for \(\beta \in (1, \infty]\)). It is shown that \((I_{f_\beta}(Q_1, Q_2))^{\min(\beta, 1/2)}\) are distances of probability distributions \(Q_1, Q_2\) for \(\beta \in (0, \infty)\). The applicability of \(I_{f_\beta}\)-divergences in statistics is also considered. In particular, it is shown that the \(I_{f_\beta}\)-projections of appropriate empirical distributions onto regular families define distribution estimates which are consistent in the case of an i.i.d. sample of size n. The order of consistency is investigated as well.
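The abstract names three members of the class explicitly, which suffices for a small numerical illustration on discrete probability vectors. The following Python sketch is an assumption-laden illustration, not code from the paper: the function names i_beta and metric_divergence are invented here, the general Arimoto-based \(f_\beta\) formula is deliberately omitted (it is not stated in the abstract), and the squared Hellinger distance is taken as \(\sum_i (\sqrt{q_{1,i}} - \sqrt{q_{2,i}})^2\), although normalization conventions for it vary. The metric transform applies the exponent \(\min(\beta, 1/2)\) from the abstract for \(\beta \in (0, \infty)\); for \(\beta = \infty\), half the variation distance is used directly, since it is itself a metric.

    import numpy as np

    def kl(p, q):
        # Kullback-Leibler divergence I(p || q) for discrete distributions,
        # with the convention 0 * log(0/q) = 0.
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def i_beta(q1, q2, beta):
        # The three special cases of I_{f_beta} named in the abstract.
        q1, q2 = np.asarray(q1, float), np.asarray(q2, float)
        if beta == 0.5:
            # Squared Hellinger distance (one common normalization).
            return float(np.sum((np.sqrt(q1) - np.sqrt(q2)) ** 2))
        if beta == 1.0:
            # I(Q1 || (Q1+Q2)/2) + I(Q2 || (Q1+Q2)/2).
            m = (q1 + q2) / 2
            return kl(q1, m) + kl(q2, m)
        if beta == np.inf:
            # Half of the variation distance.
            return 0.5 * float(np.sum(np.abs(q1 - q2)))
        raise NotImplementedError("general f_beta omitted; see the paper")

    def metric_divergence(q1, q2, beta):
        # (I_{f_beta})^{min(beta, 1/2)} is a metric for beta in (0, inf);
        # for beta = inf the divergence itself is already a metric.
        exponent = min(beta, 0.5) if beta != np.inf else 1.0
        return i_beta(q1, q2, beta) ** exponent

A quick check of the metric property on three example distributions (chosen arbitrarily here):

    p = np.array([0.2, 0.3, 0.5])
    q = np.array([0.4, 0.4, 0.2])
    r = np.array([0.3, 0.3, 0.4])
    for beta in (0.5, 1.0, np.inf):
        d_pq = metric_divergence(p, q, beta)
        d_pr = metric_divergence(p, r, beta)
        d_rq = metric_divergence(r, q, beta)
        assert d_pq <= d_pr + d_rq  # triangle inequality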
Keywords: Dissimilarities, metric divergences, minimum distance estimators
This article is indexed in SpringerLink and other databases.