Estimation of Kullback–Leibler Divergence by Local Likelihood
Authors: Young Kyung Lee, Byeong U. Park
Institution: (1) Department of Statistics, Seoul National University, Seoul, 151-747, South Korea
Abstract: Motivated by the bandwidth selection problem in local likelihood density estimation and by the problem of assessing a final model chosen by a model selection procedure, we consider estimation of the Kullback–Leibler divergence. It is known that the best bandwidth choice for the local likelihood density estimator depends on the distance between the true density and the 'vehicle' parametric model. Moreover, the Kullback–Leibler divergence is a natural measure of how far the true density lies from a parametric family. We propose two estimators of the Kullback–Leibler divergence, derive their asymptotic distributions, and compare their finite-sample properties.
Research of Young Kyung Lee was supported by the Brain Korea 21 Projects in 2004. Byeong U. Park's research was supported by KOSEF through the Statistical Research Center for Complex Systems at Seoul National University.
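To make the quantity under study concrete: the Kullback–Leibler divergence D(f‖g) = E_f[log f(X) − log g(X)] measures how far a true density f lies from a parametric model g. The sketch below is purely illustrative and is not one of the two estimators proposed in the paper: it takes a normal mixture as the (here known) true density, fits a normal 'vehicle' model by maximum likelihood, and estimates the divergence by a plain Monte Carlo average; all densities and sample sizes are hypothetical choices.

```python
import math
import random

def normal_logpdf(x, mu, sigma):
    """Log density of N(mu, sigma^2) at x."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def mixture_logpdf(x):
    """Log density of the illustrative 'true' f: 0.5*N(-1.5,1) + 0.5*N(1.5,1)."""
    p1 = math.exp(normal_logpdf(x, -1.5, 1.0))
    p2 = math.exp(normal_logpdf(x, 1.5, 1.0))
    return math.log(0.5 * p1 + 0.5 * p2)

random.seed(0)
# Draw an i.i.d. sample from the true density f (the normal mixture).
xs = [random.gauss(-1.5, 1.0) if random.random() < 0.5 else random.gauss(1.5, 1.0)
      for _ in range(20000)]

# Fit the 'vehicle' parametric model g, a single normal, by maximum likelihood.
mu_hat = sum(xs) / len(xs)
sigma_hat = math.sqrt(sum((x - mu_hat) ** 2 for x in xs) / len(xs))

# Monte Carlo estimate of D(f || g) = E_f[log f(X) - log g(X)];
# positive because the bimodal f is not in the normal family.
kl_mc = sum(mixture_logpdf(x) - normal_logpdf(x, mu_hat, sigma_hat)
            for x in xs) / len(xs)
print(round(kl_mc, 4))
```

In practice f is unknown, which is exactly why the paper replaces the known log f above with a nonparametric (local likelihood) estimate; this sketch only fixes the target quantity being estimated.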
Keywords: Kernel smoothing · Local likelihood density estimation · Bandwidth · Kullback–Leibler divergence