Open Access
September, 1980
Minimax Ridge Regression Estimation
George Casella
Ann. Statist. 8(5): 1036-1056 (September, 1980). DOI: 10.1214/aos/1176345141

Abstract

The technique of ridge regression, first proposed by Hoerl and Kennard, has become a popular tool for data analysts faced with a high degree of multicollinearity in their data. By using a ridge estimator, one hopes to both stabilize one's estimates (lower the condition number of the design matrix) and improve upon the squared error loss of the least squares estimator. Recently, much attention has been focused on the latter objective. Building on the work of Stein and others, Strawderman and Thisted have developed classes of ridge regression estimators which dominate the usual estimator in risk, and hence are minimax. The unwieldy form of the risk function, however, has led these authors to minimax conditions which are stronger than needed. In this paper, using an entirely new method of proof, we derive conditions that are necessary and sufficient for minimaxity of a large class of ridge regression estimators. The conditions derived here are very similar to those derived for minimaxity of some Stein-type estimators. We also show, however, that if one forces a ridge regression estimator to satisfy the minimax conditions, it is quite likely that the other goal of Hoerl and Kennard (stability of the estimates) cannot be realized.
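To make the abstract's two goals concrete, here is a minimal numerical sketch (illustrative only, not the paper's construction): it compares the least squares estimator with a ridge estimator of the form $(X'X + kI)^{-1}X'y$ on a nearly collinear design, checking the condition number of the matrix being inverted and the resulting shrinkage. The design, true coefficients, and ridge constant `k` are all hypothetical choices made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3

# Build a nearly collinear design: the second column is a small
# perturbation of the first, so X'X is badly conditioned.
z = rng.standard_normal(n)
X = np.column_stack([z,
                     z + 0.01 * rng.standard_normal(n),
                     rng.standard_normal(n)])
beta_true = np.ones(p)          # hypothetical true coefficient vector
y = X @ beta_true + rng.standard_normal(n)

XtX = X.T @ X
k = 1.0                         # fixed ridge constant, for illustration only

beta_ls = np.linalg.solve(XtX, X.T @ y)                      # least squares
beta_ridge = np.linalg.solve(XtX + k * np.eye(p), X.T @ y)   # ridge

print("cond(X'X)      =", np.linalg.cond(XtX))
print("cond(X'X + kI) =", np.linalg.cond(XtX + k * np.eye(p)))
print("||beta_ls||    =", np.linalg.norm(beta_ls))
print("||beta_ridge|| =", np.linalg.norm(beta_ridge))
```

Adding $kI$ always lowers the condition number and shrinks the estimate toward zero; whether the ridge estimator also dominates least squares in squared error risk is precisely the question the paper's minimax conditions address.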

Citation


George Casella. "Minimax Ridge Regression Estimation." Ann. Statist. 8 (5) 1036 - 1056, September, 1980. https://doi.org/10.1214/aos/1176345141

Information

Published: September, 1980
First available in Project Euclid: 12 April 2007

zbMATH: 0492.62060
MathSciNet: MR585702
Digital Object Identifier: 10.1214/aos/1176345141

Subjects:
Primary: 62C99
Secondary: 62F10, 62H99, 62J05

Keywords: mean, minimax, normal distribution, quadratic loss, ridge regression, risk function

Rights: Copyright © 1980 Institute of Mathematical Statistics
