Open Access
August 2005
A new class of generalized Bayes minimax ridge regression estimators
Yuzo Maruyama, William E. Strawderman
Ann. Statist. 33(4): 1753-1770 (August 2005). DOI: 10.1214/009053605000000327

Abstract

Let y=Aβ+ɛ, where y is an N×1 vector of observations, β is a p×1 vector of unknown regression coefficients, A is an N×p design matrix and ɛ is a spherically symmetric error term with unknown scale parameter σ. We consider estimation of β under general quadratic loss functions, and, in particular, extend the work of Strawderman [J. Amer. Statist. Assoc. 73 (1978) 623–627] and Casella [Ann. Statist. 8 (1980) 1036–1056, J. Amer. Statist. Assoc. 80 (1985) 753–758] by finding adaptive minimax estimators (which are, under the normality assumption, also generalized Bayes) of β, which have greater numerical stability (i.e., smaller condition number) than the usual least squares estimator. In particular, we give a subclass of such estimators which, surprisingly, has a very simple form. We also show that under certain conditions the generalized Bayes minimax estimators in the normal case are also generalized Bayes and minimax in the general case of spherically symmetric errors.
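
To make the condition-number point concrete, the following is a minimal numerical sketch (not the paper's adaptive generalized Bayes estimator, which determines the amount of shrinkage from the data): it compares ordinary least squares with a simple ridge-type estimator using a fixed, hypothetical shrinkage constant k, and shows that adding kI to A'A reduces the condition number, which is the notion of numerical stability referred to in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ill-conditioned design matrix A (N x p) with nearly collinear columns.
N, p = 50, 5
A = rng.standard_normal((N, p))
A[:, 4] = A[:, 3] + 1e-3 * rng.standard_normal(N)  # near-collinearity

beta = np.ones(p)
y = A @ beta + rng.standard_normal(N)

# Ordinary least squares: beta_hat = (A'A)^{-1} A'y.
AtA = A.T @ A
beta_ols = np.linalg.solve(AtA, A.T @ y)

# Ridge-type estimator with a fixed shrinkage constant k (illustrative only;
# the estimators in the paper are adaptive and generalized Bayes).
k = 1.0
beta_ridge = np.linalg.solve(AtA + k * np.eye(p), A.T @ y)

# Adding k*I to A'A lowers its condition number, improving numerical stability.
print("cond(A'A)      =", np.linalg.cond(AtA))
print("cond(A'A + kI) =", np.linalg.cond(AtA + k * np.eye(p)))
print("OLS estimate   =", np.round(beta_ols, 3))
print("ridge estimate =", np.round(beta_ridge, 3))
```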

Citation


Yuzo Maruyama, William E. Strawderman. "A new class of generalized Bayes minimax ridge regression estimators." Ann. Statist. 33(4): 1753–1770, August 2005. https://doi.org/10.1214/009053605000000327

Information

Published: August 2005
First available in Project Euclid: 5 August 2005

zbMATH: 1078.62006
MathSciNet: MR2166561
Digital Object Identifier: 10.1214/009053605000000327

Subjects:
Primary: 62C10, 62C15, 62C20
Secondary: 62A15

Keywords: condition number, generalized Bayes, minimaxity, ridge regression

Rights: Copyright © 2005 Institute of Mathematical Statistics
