Open Access
Parametric Robustness: Small Biases can be Worthwhile
P. J. Bickel
Ann. Statist. 12(3): 864-879 (September, 1984). DOI: 10.1214/aos/1176346707

Abstract

We study estimation of the parameters of a Gaussian linear model $\mathscr{M}_0$ when we entertain the possibility that $\mathscr{M}_0$ is invalid and a larger model $\mathscr{M}_1$ should be assumed. Estimates are robust if their maximum risk over $\mathscr{M}_1$ is finite, and the most robust estimate is the least squares estimate under $\mathscr{M}_1$. We apply notions of Hodges and Lehmann (1952) and Efron and Morris (1971) to obtain (biased) estimates which do well under $\mathscr{M}_0$ at a small price in robustness. Extensions to confidence intervals, simultaneous estimation of several parameters, and large-sample approximations applying to nested parametric models are also discussed.
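The trade-off the abstract describes can be illustrated numerically in the simplest nested setting: observe $X \sim N(\theta, 1)$, with $\mathscr{M}_0: \theta = 0$ and $\mathscr{M}_1: \theta$ unrestricted. The sketch below (an illustration in the spirit of the limited-translation idea of Efron and Morris, not the paper's exact estimator) shrinks the unrestricted MLE toward the $\mathscr{M}_0$ value but moves it by at most $c$, so the bias is bounded and the maximum risk over $\mathscr{M}_1$ stays finite, while the risk under $\mathscr{M}_0$ drops well below the least-squares risk of 1.

```python
import numpy as np

rng = np.random.default_rng(0)

def limited_translation(x, c):
    # Shrink the unrestricted MLE x toward the M0 value 0, but never
    # move it by more than c (soft thresholding); the bias is bounded
    # by c, so the maximum risk over M1 is finite (about 1 + c^2).
    return np.sign(x) * np.maximum(np.abs(x) - c, 0.0)

def mc_risk(theta, estimator, n=200_000):
    # Monte Carlo mean-squared-error risk of `estimator` at theta
    # in the toy model X ~ N(theta, 1).
    x = rng.normal(theta, 1.0, size=n)
    return float(np.mean((estimator(x) - theta) ** 2))

c = 0.5
thetas = np.linspace(-5.0, 5.0, 41)
risks = [mc_risk(t, lambda x: limited_translation(x, c)) for t in thetas]

risk_at_0 = mc_risk(0.0, lambda x: limited_translation(x, c))
print(f"risk at theta = 0 (M0 true):  {risk_at_0:.3f}  (LS risk is 1.0)")
print(f"max risk over grid (M1):      {max(risks):.3f}  (bounded, near 1 + c^2)")
```

With $c = 0.5$ the risk at $\theta = 0$ is roughly 0.4, a substantial gain under $\mathscr{M}_0$, bought at a maximum risk of about $1 + c^2 = 1.25$ over $\mathscr{M}_1$; a hard-threshold (pretesting) rule would instead let the maximum risk grow with the pretest cutoff.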

Citation


P. J. Bickel. "Parametric Robustness: Small Biases can be Worthwhile." Ann. Statist. 12(3): 864-879, September, 1984. https://doi.org/10.1214/aos/1176346707

Information

Published: September, 1984
First available in Project Euclid: 12 April 2007

zbMATH: 0545.62028
MathSciNet: MR751278
Digital Object Identifier: 10.1214/aos/1176346707

Subjects:
Primary: 62F10
Secondary: 62F25

Keywords: confidence intervals, limited translation estimates, parametric robustness, pretesting

Rights: Copyright © 1984 Institute of Mathematical Statistics

Vol. 12 • No. 3 • September, 1984