Open Access
General maximum likelihood empirical Bayes estimation of normal means
Wenhua Jiang, Cun-Hui Zhang
Ann. Statist. 37(4): 1647-1684 (August 2009). DOI: 10.1214/08-AOS638

Abstract

We propose a general maximum likelihood empirical Bayes (GMLEB) method for the estimation of a mean vector based on observations with i.i.d. normal errors. We prove that under mild moment conditions on the unknown means, the average mean squared error (MSE) of the GMLEB is within an infinitesimal fraction of the minimum average MSE among all separable estimators which use a single deterministic estimating function on individual observations, provided that the risk is of greater order than (log n)^5/n. We also prove that the GMLEB is uniformly approximately minimax in regular and weak ℓp balls when the order of the length-normalized norm of the unknown means is between (log n)^{κ1}/n^{1/(p∧2)} and n/(log n)^{κ2}. Simulation experiments demonstrate that the GMLEB outperforms the James–Stein and several state-of-the-art threshold estimators in a wide range of settings without much downside.
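The GMLEB idea — fit an unknown prior by nonparametric maximum likelihood over the observations, then apply the Bayes (posterior mean) rule under that fitted prior — can be sketched numerically. The version below is a minimal illustration only, not the authors' algorithm: the fixed support grid, uniform initialization, and plain EM iteration are all simplifying assumptions, and the function name `gmleb` is hypothetical.

```python
import numpy as np

def gmleb(x, n_grid=200, n_iter=500):
    """Illustrative empirical Bayes posterior-mean estimator for x_i = theta_i + N(0,1).

    Approximates the nonparametric MLE of the prior G on a fixed grid via EM,
    then returns the posterior mean of each theta_i under the fitted prior.
    """
    x = np.asarray(x, dtype=float)
    # Candidate support points for the prior G (assumption: a uniform grid
    # over the data range is fine enough for illustration).
    grid = np.linspace(x.min(), x.max(), n_grid)
    w = np.full(n_grid, 1.0 / n_grid)  # mixing weights, initialized uniform
    # Likelihood matrix L[i, j] = phi(x_i - u_j) for standard normal errors.
    L = np.exp(-0.5 * (x[:, None] - grid[None, :]) ** 2) / np.sqrt(2 * np.pi)
    for _ in range(n_iter):
        # E-step: posterior responsibility of each grid point for each x_i.
        num = L * w[None, :]
        R = num / num.sum(axis=1, keepdims=True)
        # M-step: the NPMLE update averages responsibilities over observations.
        w = R.mean(axis=0)
    # Bayes rule: posterior mean of theta_i under the estimated prior.
    post = L * w[None, :]
    return (post * grid[None, :]).sum(axis=1) / post.sum(axis=1)
```

In simulations with a sparse or clustered mean vector, this plug-in posterior mean typically shrinks observations adaptively toward the fitted prior's mass points, which is the behavior the abstract compares against James–Stein and threshold estimators.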

Citation


Wenhua Jiang, Cun-Hui Zhang. "General maximum likelihood empirical Bayes estimation of normal means." Ann. Statist. 37(4): 1647-1684, August 2009. https://doi.org/10.1214/08-AOS638

Information

Published: August 2009
First available in Project Euclid: 18 June 2009

zbMATH: 1168.62005
MathSciNet: MR2533467
Digital Object Identifier: 10.1214/08-AOS638

Subjects:
Primary: 62C12, 62C25, 62G05, 62G08, 62G20

Keywords: adaptive estimation, compound estimation, empirical Bayes, shrinkage estimator, threshold estimator, white noise

Rights: Copyright © 2009 Institute of Mathematical Statistics
