The Annals of Statistics

Adaptive Prediction and Estimation in Linear Regression with Infinitely Many Parameters

A. Goldenshluger and A. Tsybakov

Full-text: Open access


The problem of adaptive prediction and estimation in the stochastic linear regression model with infinitely many parameters is considered. We suggest a prediction method that is sharp asymptotically minimax adaptive over ellipsoids in $\ell_2$. The method consists in applying blockwise Stein’s rule with “weakly” geometrically increasing blocks to the penalized least squares fits of the first $N$ coefficients. To prove the results we develop oracle inequalities for a sequence model with correlated data.
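As a rough illustrative sketch only (not the paper’s exact procedure, which operates on penalized least squares fits and uses specific block and penalty choices), a positive-part blockwise James–Stein rule applied to a vector of coefficient estimates, with block lengths growing by a weak geometric factor, might look like:

```python
import numpy as np

def blockwise_stein(theta_hat, sigma2, rho=0.1):
    """Positive-part blockwise James-Stein shrinkage with weakly
    geometrically increasing block lengths (growth factor 1 + rho).
    `theta_hat`, `sigma2`, and `rho` are hypothetical names for this
    illustration: noisy coefficient estimates, noise level per
    coordinate, and the block growth rate."""
    n = len(theta_hat)
    est = np.zeros(n)
    start, length = 0, 1
    while start < n:
        stop = min(start + length, n)
        block = theta_hat[start:stop]
        m = stop - start
        norm2 = np.sum(block ** 2)
        # positive-part shrinkage factor within the block
        shrink = max(0.0, 1.0 - m * sigma2 / norm2) if norm2 > 0 else 0.0
        est[start:stop] = shrink * block
        # block length grows "weakly" geometrically
        length = int(np.ceil(length * (1.0 + rho)))
        start = stop
    return est
```

Blocks whose empirical signal energy is below the noise level are shrunk entirely to zero, while blocks dominated by signal are kept almost intact; this is what makes such rules mimic an oracle choice of shrinkage weights.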

Article information

Ann. Statist., Volume 29, Number 6 (2001), 1601-1619.

First available in Project Euclid: 5 March 2002


Primary: 62G05 (Estimation); 62G20 (Asymptotic properties)

Keywords: linear regression with infinitely many parameters; adaptive prediction; exact asymptotics of minimax risk; blockwise Stein’s rule; oracle inequalities


Goldenshluger, A.; Tsybakov, A. Adaptive Prediction and Estimation in Linear Regression with Infinitely Many Parameters. Ann. Statist. 29 (2001), no. 6, 1601--1619. doi:10.1214/aos/1015345956.



  • Belitser, E. N. and Levit, B. Ya. (1995). On minimax filtering on ellipsoids. Math. Methods Statist. 4 259-273.
  • Breiman, L. and Freedman, D. (1983). How many variables should be entered in a regression equation? J. Amer. Statist. Assoc. 78 131-136.
  • Cai, T. (1999). Adaptive wavelet estimation: a block thresholding and oracle inequality approach. Ann. Statist. 27 898-924.
  • Cavalier, L. and Tsybakov, A. (2000). Sharp adaptation for inverse problems with random noise. Probab. Theory Related Fields. To appear. Available at
  • Donoho, D. and Johnstone, I. M. (1995). Adapting to unknown smoothness via wavelet shrinkage. J. Amer. Statist. Assoc. 90 1200-1224.
  • Efromovich, S. (1999). Nonparametric Curve Estimation. Springer, New York.
  • Efroimovich, S. Yu. and Pinsker, M. S. (1984). Learning algorithm for nonparametric filtering. Automat. Remote Control, 11 1434-1440.
  • Efromovich, S. and Pinsker, M. S. (1996). Sharp-optimal and adaptive estimation for heteroscedastic nonparametric regression. Statist. Sinica 6 925-942.
  • Goldenshluger, A. and Tsybakov, A. (1999). Optimal prediction for linear regression with infinitely many parameters. Available at
  • Johnstone, I. M. (1998). Function estimation in Gaussian noise: sequence models. Available at
  • Johnstone, I. M. (1999). Wavelet shrinkage for correlated data and inverse problems: adaptivity results. Statist. Sinica 9 51-83.
  • Lehmann, E. and Casella, G. (1998). Theory of Point Estimation. Springer, New York.
  • Marcus, M. and Minc, H. (1992). A Survey of Matrix Theory and Matrix Inequalities. Dover, New York.
  • Nemirovski, A. (2000). Topics in Non-Parametric Statistics. École d'été de Probabilités de Saint-Flour XXVIII. Lecture Notes in Math. 1738 89-277. Springer, Berlin.
  • Petrov, V. V. (1995). Limit Theorems of Probability Theory. Clarendon Press, Oxford.
  • Pinsker, M. S. (1980). Optimal filtering of square integrable signals in Gaussian white noise. Problems Inform. Transmission 16 120-133.
  • Shibata, R. (1981). An optimal selection of regression variables. Biometrika 68 45-54.
  • Stein, Ch. (1981). Estimation of the mean of a multivariate normal distribution. Ann. Statist. 9 1135-1151.