Abstract: | The paper revisits the problem of selecting priors for a regular one-parameter family of distributions. The goal is to find an “objective” or “default” prior by approximately maximizing the divergence between the prior and the posterior under the general divergence criterion introduced by Amari (Ann Stat 10:357–387, 1982) and Cressie and Read (J R Stat Soc Ser B 46:440–464, 1984). The maximization is based on an asymptotic expansion of this divergence. The Kullback–Leibler, Bhattacharyya–Hellinger and Chi-square divergences are special cases of this general criterion. It is shown that, with the single exception of the Chi-square divergence, the general divergence criterion yields Jeffreys’ prior. For the Chi-square divergence, we obtain a prior that differs both from Jeffreys’ prior and from that of Clarke and Sun (Sankhya Ser A 59:215–231, 1997). |