Open Access
Asymptotic Lower Bounds for Risk in Robust Estimation
Rudolf Beran
Ann. Statist. 8(6): 1252-1264 (November, 1980). DOI: 10.1214/aos/1176345198

Abstract

Robustness and efficiency of a parameter estimate $T$ can be assessed by comparing the fitted parametric distribution $P_T$ with the actual distribution, which is assumed to lie near the parametric family $\{P_\theta:\theta\in\Theta\}$. Asymptotic lower bounds are established for the minimax risk over distributions near the parametric model, taking as loss function a monotone increasing function of the Hellinger distance between the actual distribution of the sample and the fitted distribution determined by $T$. The set of marginal distributions considered in the minimax calculation is a subset of the Hellinger ball of radius $O(n^{-1/2})$ centered at $P_\theta$, where $n$ is the sample size. When the loss function is bounded, the lower bound on maximum risk can be attained asymptotically. However, an estimator of $\theta$ which is asymptotically minimax for bounded loss functions may be far from optimal when the loss function is unbounded. Such divergent behavior is exhibited, for instance, by the sample mean in nearly normal models.
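For orientation, a commonly used normalization of the Hellinger distance between distributions $P$ and $Q$ with densities $p$ and $q$ relative to a dominating measure $\mu$ is

$$H(P, Q) = \left[\frac{1}{2}\int \left(p^{1/2} - q^{1/2}\right)^2 \, d\mu\right]^{1/2},$$

so the neighborhoods over which the maximum risk is computed take the form $\{Q : H(Q, P_\theta) \le c\, n^{-1/2}\}$ for some constant $c > 0$. The paper's own normalization constant and the precise subset of this Hellinger ball used in the minimax calculation may differ in detail.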

Citation


Rudolf Beran. "Asymptotic Lower Bounds for Risk in Robust Estimation." Ann. Statist. 8 (6) 1252 - 1264, November, 1980. https://doi.org/10.1214/aos/1176345198

Information

Published: November, 1980
First available in Project Euclid: 12 April 2007

zbMATH: 0453.62032
MathSciNet: MR594642
Digital Object Identifier: 10.1214/aos/1176345198

Subjects:
Primary: 62G35
Secondary: 62F10

Keywords: asymptotic minimax bounds, asymptotic minimax estimators, parametric models, risk, robust estimation

Rights: Copyright © 1980 Institute of Mathematical Statistics
