Abstract
Mallows has conjectured that among distributions which are Gaussian but for occasional contamination by additive noise, the one having least Fisher information has (two-sided) geometric contamination. A very similar problem arises in estimation of a nonnegative vector parameter in Gaussian white noise when it is known also that most [i.e., $(1 - \varepsilon)$] components are zero. We provide a partial asymptotic expansion of the minimax risk as $\varepsilon \rightarrow 0$. While the conjecture seems unlikely to be exactly true for finite $\varepsilon$, we verify it asymptotically up to the accuracy of the expansion. Numerical work suggests the expansion is accurate for $\varepsilon$ as large as 0.05. The best $l_1$-estimation rule is first- but not second-order minimax. The results bear on an earlier study of maximum entropy estimation and various questions in robustness and function estimation using wavelet bases.
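As a hypothetical illustration (not taken from the paper), the sparse-means setting can be sketched numerically: a fraction $\varepsilon$ of coordinates of a mean vector are nonzero, observations carry unit Gaussian noise, and the $l_1$-penalized (soft-thresholding) rule is applied with a threshold tuned to the sparsity. The leading term of the minimax risk per coordinate as $\varepsilon \rightarrow 0$ behaves like $2\varepsilon \log(1/\varepsilon)$; the placement of the nonzero means at $\sqrt{2\log(1/\varepsilon)}$ below is an illustrative choice, not the paper's least-favorable prior.

```python
import numpy as np

def soft_threshold(y, t):
    """Soft-thresholding (l1) rule: shrink each coordinate toward 0 by t."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

rng = np.random.default_rng(0)
n, eps = 100_000, 0.01                    # fraction eps of means are nonzero
mu = np.sqrt(2 * np.log(1 / eps))         # illustrative signal height (assumption)
theta = np.where(rng.random(n) < eps, mu, 0.0)
y = theta + rng.standard_normal(n)        # Gaussian white noise, sigma = 1

t = np.sqrt(2 * np.log(1 / eps))          # threshold tuned to sparsity level
risk = np.mean((soft_threshold(y, t) - theta) ** 2)
print(f"empirical risk per coordinate: {risk:.4f}")
print(f"leading term 2*eps*log(1/eps):  {eps * 2 * np.log(1 / eps):.4f}")
```

The empirical risk and the leading term agree in order of magnitude, consistent with soft thresholding being first-order (though, per the abstract, not second-order) minimax.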
Citation
Iain M. Johnstone. "On Minimax Estimation of a Sparse Normal Mean Vector." Ann. Statist. 22 (1): 271-289, March 1994. https://doi.org/10.1214/aos/1176325368