Interpreting Kullback-Leibler divergence with the Neyman-Pearson lemma
Authors: Shinto Eguchi, John Copas
Institution:a Institute of Statistical Mathematics and Graduate University for Advanced Studies, Minami-Azabu, Minato-ku, Tokyo 106-8569, Japan
b Department of Statistics, University of Warwick, Coventry CV4 7AL, UK
Abstract: The Kullback-Leibler divergence and the Neyman-Pearson lemma are two fundamental concepts in statistics. Both concern likelihood ratios: the Kullback-Leibler divergence is the expected log-likelihood ratio, and the Neyman-Pearson lemma concerns the error rates of likelihood ratio tests. Exploring this connection gives another statistical interpretation of the Kullback-Leibler divergence, in terms of the loss of power of the likelihood ratio test when the wrong distribution is used for one of the hypotheses. In this interpretation, the standard non-negativity property of the Kullback-Leibler divergence is essentially a restatement of the optimality of likelihood ratio tests established by the Neyman-Pearson lemma. The asymmetry of the Kullback-Leibler divergence is also reviewed from the viewpoint of information geometry.
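For reference, a minimal formulation of the two concepts the abstract connects (the notation below is ours, not quoted from the paper): for densities f and g,

    D(f,g) \;=\; \int f(x)\,\log\frac{f(x)}{g(x)}\,dx \;=\; \mathrm{E}_f\!\left[\log\frac{f(X)}{g(X)}\right] \;\ge\; 0,

with equality if and only if f = g almost everywhere. The Neyman-Pearson lemma states that, for testing H_0: g against H_1: f, the test that rejects when the likelihood ratio f(x)/g(x) exceeds a cutoff maximizes power among all tests of the same size; the paper's interpretation reads D(f,g) as the power lost by such a test when a misspecified density is substituted for one of the hypotheses.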
Keywords: 62A01; 62F03
This article is indexed in databases including ScienceDirect.