The Annals of Statistics

On Kullback-Leibler Loss and Density Estimation

Peter Hall



"Discrimination information," or Kullback-Leibler loss, is an appropriate measure of distance in problems of discrimination. We examine it in the context of nonparametric kernel density estimation and show that its asymptotic properties are profoundly influenced by tail properties of the kernel and of the unknown density. We suggest ways of choosing the kernel so as to reduce loss, and describe the extent to which likelihood cross-validation asymptotically minimises loss. Likelihood cross-validation generally leads to selection of a window width of the correct order of magnitude, but not necessarily to a window with the correct first-order properties. However, if the kernel is chosen appropriately, then likelihood cross-validation does result in asymptotic minimisation of Kullback-Leibler loss.
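The likelihood cross-validation rule discussed in the abstract can be sketched as follows: choose the window width (bandwidth) that maximises the leave-one-out log-likelihood of the data, which amounts to minimising an empirical estimate of the Kullback-Leibler loss. This is a minimal illustration with a Gaussian kernel; the function names and the bandwidth grid are ours, not from the paper.

```python
import numpy as np

def loo_log_likelihood(data, h):
    """Leave-one-out log-likelihood of a Gaussian-kernel density
    estimate with window width h."""
    n = len(data)
    diffs = data[:, None] - data[None, :]              # pairwise X_i - X_j
    k = np.exp(-0.5 * (diffs / h) ** 2) / (h * np.sqrt(2 * np.pi))
    np.fill_diagonal(k, 0.0)                           # drop X_i from its own estimate
    f_loo = k.sum(axis=1) / (n - 1)                    # \hat f_{-i}(X_i)
    return np.log(f_loo).sum()

def likelihood_cv_bandwidth(data, grid):
    """Pick the h on `grid` maximising the leave-one-out log-likelihood."""
    scores = [loo_log_likelihood(data, h) for h in grid]
    return grid[int(np.argmax(scores))]

# Toy usage on simulated standard normal data.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
grid = np.linspace(0.1, 1.5, 30)
h_cv = likelihood_cv_bandwidth(x, grid)
```

Note that, as the abstract emphasises, the behaviour of this criterion depends on the tails of both the kernel and the unknown density; the Gaussian kernel here is only one possible choice.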

Article information

Ann. Statist., Volume 15, Number 4 (1987), 1491-1519.

First available in Project Euclid: 12 April 2007


Primary: 62G99: None of the above, but in this section
Secondary: 62H99: None of the above, but in this section

Keywords: Density estimation; discrimination; kernel method; Kullback-Leibler loss; likelihood cross-validation


Hall, Peter. On Kullback-Leibler Loss and Density Estimation. Ann. Statist. 15 (1987), no. 4, 1491--1519. doi:10.1214/aos/1176350606.
