Abstract
Ridge regression is a well-known technique for estimating the coefficients of a linear model. The method of regularization is a similar approach commonly used to solve underdetermined linear equations with discrete noisy data. When applying such a technique, the choice of the smoothing (or regularization) parameter $h$ is crucial. Generalized cross-validation (GCV) and Mallows' $C_L$ are two popular methods for estimating a good value of $h$ from the data. Their asymptotic properties, such as consistency and asymptotic optimality, have been studied extensively [Craven and Wahba (1979); Golub, Heath and Wahba (1979); Speckman (1985)]. Convergence results for the actual (random) parameter given by GCV and $C_L$ have been established by Li (1985, 1986). Recently, Girard (1987, 1989) proposed fast randomized versions of GCV and $C_L$. The purpose of this paper is to show that the above convergence results also hold for these new methods.
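To make the randomized-trace idea concrete, here is a minimal sketch in Python of a randomized GCV score for ridge regression, in the spirit of Girard's fast versions. Ordinary GCV minimizes $V(h) = \frac{1}{n}\|(I - A(h))y\|^2 / \left[\frac{1}{n}\operatorname{tr}(I - A(h))\right]^2$ with influence matrix $A(h) = X(X^\top X + hI)^{-1}X^\top$; the fast randomized version replaces the exact trace by the Monte Carlo estimate $w^\top A(h) w$ with white noise $w \sim N(0, I_n)$, since $E[w^\top A(h) w] = \operatorname{tr} A(h)$. The function names, grid search, and simulated data below are illustrative assumptions, not the paper's notation or implementation.

```python
import numpy as np

def apply_hat(X, h, v):
    """Apply the ridge influence matrix A(h) = X (X'X + h I)^{-1} X' to a vector v."""
    p = X.shape[1]
    return X @ np.linalg.solve(X.T @ X + h * np.eye(p), X.T @ v)

def randomized_gcv(X, y, h, w):
    """Randomized GCV score: tr A(h) estimated by w' A(h) w, E[w' A(h) w] = tr A(h)."""
    n = X.shape[0]
    resid = y - apply_hat(X, h, y)          # (I - A(h)) y
    tr_est = w @ apply_hat(X, h, w)         # stochastic trace estimate
    return (resid @ resid / n) / (1.0 - tr_est / n) ** 2

# Illustrative usage on simulated data.
rng = np.random.default_rng(0)
n, p = 200, 30
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 2.0
y = X @ beta + rng.standard_normal(n)

w = rng.standard_normal(n)                  # one white-noise vector, reused for every h
grid = np.logspace(-3, 3, 25)
h_best = min(grid, key=lambda h: randomized_gcv(X, y, h, w))
```

Reusing a single random vector $w$ across all candidate values of $h$ keeps the randomized criterion a deterministic function of $h$ during the minimization; the cost of the trace term is then just one extra application of $A(h)$ per candidate, rather than an exact trace computation.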
Citation
Didier A. Girard. "Asymptotic Optimality of the Fast Randomized Versions of GCV and $C_L$ in Ridge Regression and Regularization." Ann. Statist. 19(4): 1950-1963, December 1991. https://doi.org/10.1214/aos/1176348380