A general divergence criterion for prior selection
Authors: Malay Ghosh (ghoshm@stat.ufl.edu), Victor Mergel, Ruitao Liu
Affiliations: 1. Department of Statistics, University of Florida, Gainesville, USA; 2. Bristol Myers Squibb, Princeton, USA
Abstract: The paper revisits the problem of selecting priors for regular one-parameter families of distributions. The goal is to find an "objective" or "default" prior by approximately maximizing the divergence between the prior and the posterior under the general divergence criterion introduced by Amari (Ann Stat 10:357–387, 1982) and Cressie and Read (J R Stat Soc Ser B 46:440–464, 1984). The maximization is based on an asymptotic expansion of this divergence. The Kullback–Leibler, Bhattacharyya–Hellinger and chi-square divergences are special cases of this general criterion. It is shown that, with the single exception of the chi-square divergence, the general divergence criterion yields Jeffreys' prior. For the chi-square divergence, we obtain a prior different both from Jeffreys' prior and from that of Clarke and Sun (Sankhya Ser A 59:215–231, 1997).
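To make the family of divergences concrete, below is a minimal numerical sketch of the Cressie–Read power divergence, whose members include (up to constant factors) the Kullback–Leibler, Hellinger and chi-square divergences named in the abstract. The index parameter lam, the function name cressie_read and the example distributions are illustrative assumptions for this sketch, not notation taken from the paper.

    import numpy as np

    def cressie_read(p, q, lam):
        """Cressie-Read power divergence between discrete distributions p and q:
        2 / (lam * (lam + 1)) * sum_i p_i * ((p_i / q_i)**lam - 1).
        The singular values lam = 0 and lam = -1 are handled via their limits.
        """
        p, q = np.asarray(p, float), np.asarray(q, float)
        if np.isclose(lam, 0.0):       # limit lam -> 0: 2 * KL(p || q)
            return 2.0 * np.sum(p * np.log(p / q))
        if np.isclose(lam, -1.0):      # limit lam -> -1: 2 * KL(q || p)
            return 2.0 * np.sum(q * np.log(q / p))
        return 2.0 / (lam * (lam + 1.0)) * np.sum(p * ((p / q) ** lam - 1.0))

    p = np.array([0.2, 0.5, 0.3])
    q = np.array([0.3, 0.4, 0.3])

    # lam = 1 recovers Pearson's chi-square, sum (p - q)^2 / q
    print(np.isclose(cressie_read(p, q, 1.0), np.sum((p - q) ** 2 / q)))
    # lam = -1/2 recovers 4 times the squared Hellinger distance
    hellinger_sq = np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)
    print(np.isclose(cressie_read(p, q, -0.5), 4.0 * hellinger_sq))

As a concrete instance of the paper's main conclusion, Jeffreys' prior for the Bernoulli(θ) family is proportional to the square root of the Fisher information, √I(θ) = [θ(1−θ)]^(−1/2), i.e., the Beta(1/2, 1/2) distribution.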
This article is indexed in SpringerLink and other databases.