Abstract
We investigate the asymptotic normality of the posterior distribution in the discrete setting, when the model dimension grows with the sample size. We consider a probability mass function $\theta_0$ on $\mathbb{N}\setminus\{0\}$ and a sequence of truncation levels $(k_n)_n$ satisfying $k_n^3 \leq n \inf_{i \leq k_n} \theta_0(i)$. Let $\hat{\theta}_n$ denote the maximum likelihood estimate of $(\theta_0(i))_{i \leq k_n}$ and let $\Delta_n(\theta_0)$ denote the $k_n$-dimensional vector whose $i$-th coordinate is $\sqrt{n}(\hat{\theta}_{n}(i)-\theta_{0}(i))$ for $1 \leq i \leq k_n$. We show that, under mild conditions on $\theta_0$ and on the sequence of prior probabilities on the $k_n$-dimensional simplices, the variation distance between the posterior distribution recentered around $\hat{\theta}_n$ and rescaled by $\sqrt{n}$, and the $k_n$-dimensional Gaussian distribution $\mathcal{N}(\Delta_{n}(\theta_{0}),I^{-1}(\theta_{0}))$, converges in probability to 0. This theorem can be used to prove the asymptotic normality of Bayesian estimators of the Shannon and Rényi entropies.
The proofs are based on concentration inequalities for centered and non-centered chi-square (Pearson) statistics. The latter make it possible to establish posterior concentration rates with respect to the Fisher distance rather than the Hellinger distance, as is commonplace in non-parametric Bayesian statistics.
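The statement of the theorem can be illustrated numerically in the simplest conjugate case. The sketch below (a hypothetical setup, not taken from the paper: it fixes the dimension $k$ rather than letting it grow, takes a uniform $\theta_0$, and uses a Dirichlet prior so the posterior is available in closed form) checks that the recentered, rescaled posterior draws $\sqrt{n}(\theta - \hat{\theta}_n)$ have covariance close to the multinomial inverse Fisher information $\operatorname{diag}(\theta_0) - \theta_0\theta_0^\top$:

```python
import numpy as np

rng = np.random.default_rng(0)
k, n = 5, 200_000
theta0 = np.ones(k) / k  # illustrative choice: uniform pmf on {1, ..., k}

# Simulate n observations and form the MLE from the counts.
counts = rng.multinomial(n, theta0)
theta_hat = counts / n

# Dirichlet(1, ..., 1) prior gives a Dirichlet(1 + counts) posterior.
post_draws = rng.dirichlet(1 + counts, size=100_000)

# Recentre around the MLE and rescale by sqrt(n), as in the theorem.
z = np.sqrt(n) * (post_draws - theta_hat)

# Limiting covariance: inverse Fisher information of the multinomial,
# diag(theta0) - theta0 theta0^T (singular, since the pmf sums to 1).
limit_cov = np.diag(theta0) - np.outer(theta0, theta0)
emp_cov = np.cov(z, rowvar=False)
max_dev = np.max(np.abs(emp_cov - limit_cov))
print(max_dev)  # small for large n
```

For large $n$ the maximal entrywise deviation between the empirical posterior covariance and the Gaussian limit is small, consistent with convergence of the variation distance to 0; the regime of the theorem, where $k_n$ grows subject to $k_n^3 \leq n \inf_{i \leq k_n} \theta_0(i)$, is of course not captured by this fixed-$k$ sanity check.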
Citation
S. Boucheron, E. Gassiat. "A Bernstein–von Mises Theorem for discrete probability distributions." Electron. J. Statist. 3 (2009), 114–148. https://doi.org/10.1214/08-EJS262