Abstract
We present theoretical properties of the log-concave maximum likelihood estimator of a density based on an independent and identically distributed sample in ℝ^d. Our study covers both the case where the true underlying density is log-concave, and where this model is misspecified. We begin by showing that for a sequence of log-concave densities, convergence in distribution implies much stronger types of convergence – in particular, it implies convergence in Hellinger distance and even in certain exponentially weighted total variation norms. In our main result, we prove the existence and uniqueness of a log-concave density that minimises the Kullback–Leibler divergence from the true density over the class of all log-concave densities, and also show that the log-concave maximum likelihood estimator converges almost surely in these exponentially weighted total variation norms to this minimiser. In the case of a correctly specified model, this demonstrates a strong type of consistency for the estimator; in a misspecified model, it shows that the estimator converges to the log-concave density that is closest in the Kullback–Leibler sense to the true density.
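For reference, the LaTeX sketch below records the standard quantities the abstract refers to: the Kullback–Leibler divergence that the limit minimises over the log-concave class, the Hellinger distance, and an exponentially weighted total variation norm of the type in which the estimator converges. The symbols f_0 (true density), f^* (Kullback–Leibler minimiser), \hat f_n (log-concave maximum likelihood estimator) and the weight parameter a > 0 are notational assumptions introduced here; the admissible range of a is specified in the paper itself.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Kullback--Leibler divergence of a log-concave candidate f from the true
% density f_0, and the minimiser f^* over the class of log-concave densities
\[
  d_{\mathrm{KL}}(f_0 \,\|\, f)
    = \int_{\mathbb{R}^d} f_0(x) \log \frac{f_0(x)}{f(x)} \, dx ,
  \qquad
  f^* = \operatorname*{arg\,min}_{f \ \text{log-concave}}
          d_{\mathrm{KL}}(f_0 \,\|\, f) .
\]

% Squared Hellinger distance between two densities f and g on R^d
\[
  d_{\mathrm{H}}^{2}(f, g)
    = \int_{\mathbb{R}^d} \bigl( \sqrt{f(x)} - \sqrt{g(x)} \bigr)^{2} \, dx .
\]

% Exponentially weighted total variation convergence of the estimator
% \hat f_n to f^* (for weights a > 0 in the range given in the paper)
\[
  \int_{\mathbb{R}^d} e^{a \|x\|}
    \bigl| \hat f_n(x) - f^*(x) \bigr| \, dx
  \xrightarrow{\ \text{a.s.}\ } 0 .
\]

\end{document}
```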
Citation
Madeleine Cule and Richard Samworth. "Theoretical properties of the log-concave maximum likelihood estimator of a multidimensional density." Electron. J. Statist. 4 (2010): 254–270. https://doi.org/10.1214/09-EJS505