The Annals of Statistics, Volume 15, Number 4 (1987), 1491-1519.
On Kullback-Leibler Loss and Density Estimation
"Discrimination information," or Kullback-Leibler loss, is an appropriate measure of distance in problems of discrimination. We examine it in the context of nonparametric kernel density estimation and show that its asymptotic properties are profoundly influenced by tail properties of the kernel and of the unknown density. We suggest ways of choosing the kernel so as to reduce loss, and describe the extent to which likelihood cross-validation asymptotically minimises loss. Likelihood cross-validation generally leads to selection of a window width of the correct order of magnitude, but not necessarily to a window with the correct first-order properties. However, if the kernel is chosen appropriately, then likelihood cross-validation does result in asymptotic minimisation of Kullback-Leibler loss.
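The likelihood cross-validation procedure discussed in the abstract can be sketched as follows. This is an illustrative implementation, not the paper's: it uses a Gaussian kernel and a simple bandwidth grid, and the function names (`loo_log_likelihood`, `cv_bandwidth`) are ours. The window width is chosen to maximise the leave-one-out log-likelihood, which is the sample analogue of minimising Kullback-Leibler loss.

```python
import numpy as np

def loo_log_likelihood(data, h):
    """Leave-one-out log-likelihood for a Gaussian-kernel density
    estimate with window width (bandwidth) h."""
    n = len(data)
    # Pairwise differences X_i - X_j for all i, j.
    diffs = data[:, None] - data[None, :]
    # Gaussian kernel contributions K_h(X_i - X_j).
    k = np.exp(-0.5 * (diffs / h) ** 2) / (h * np.sqrt(2 * np.pi))
    # Exclude each point's own contribution (leave-one-out).
    np.fill_diagonal(k, 0.0)
    # f_{-i}(X_i): density at X_i estimated from the other n - 1 points.
    f_loo = k.sum(axis=1) / (n - 1)
    return np.sum(np.log(f_loo))

def cv_bandwidth(data, grid):
    """Return the bandwidth on `grid` maximising the LOO likelihood."""
    scores = [loo_log_likelihood(data, h) for h in grid]
    return grid[int(np.argmax(scores))]

rng = np.random.default_rng(0)
x = rng.normal(size=200)
grid = np.linspace(0.05, 1.5, 60)
h_star = cv_bandwidth(x, grid)
```

As the abstract cautions, the selected width has the correct order of magnitude in general, but first-order optimality for Kullback-Leibler loss depends on the tails of the kernel and of the underlying density.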
First available in Project Euclid: 12 April 2007
Hall, Peter. On Kullback-Leibler Loss and Density Estimation. Ann. Statist. 15 (1987), no. 4, 1491--1519. doi:10.1214/aos/1176350606. https://projecteuclid.org/euclid.aos/1176350606