The Annals of Statistics

Asymptotic equivalence of density estimation and Gaussian white noise

Michael Nussbaum



Signal recovery in Gaussian white noise with variance tending to zero has long served as a representative model for nonparametric curve estimation, exhibiting all the essential traits in a pure form. The equivalence has mostly been stated informally, but an approximation in the sense of Le Cam's deficiency distance $\Delta$ makes it precise. The models are then asymptotically equivalent for all purposes of statistical decision making with bounded loss. In nonparametrics, a first result of this kind has recently been established for Gaussian regression. We consider the analogous problem for the experiment given by $n$ i.i.d. observations having density $f$ on the unit interval. Our basic result concerns the parameter space of densities which lie in a Hölder ball with exponent $\alpha > 1/2$ and which are uniformly bounded away from zero. We show that an i.i.d. sample of size $n$ with density $f$ is globally asymptotically equivalent to a white noise experiment with drift $f^{1/2}$ and variance $\tfrac{1}{4}n^{-1}$. This represents a nonparametric analog of Le Cam's heteroscedastic Gaussian approximation in the finite-dimensional case. The proof utilizes empirical process techniques related to the Hungarian construction. White noise models on $f$ and $\log f$ are also considered, allowing various "automatic" asymptotic risk bounds in the i.i.d. model to be derived from white noise.
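The white noise experiment referred to in the abstract can be written out explicitly. In the standard formulation (a sketch based on the drift and variance stated above, with $W$ denoting standard Brownian motion), observing $X_1, \ldots, X_n$ i.i.d. with density $f$ on $[0,1]$ is asymptotically equivalent to observing the path

$$dY_t = f^{1/2}(t)\,dt + \tfrac{1}{2} n^{-1/2}\,dW_t, \qquad t \in [0,1],$$

so the drift is $f^{1/2}$ and the noise variance per unit time is $\tfrac{1}{4} n^{-1}$. The square root acts as a variance-stabilizing transform: it renders the limiting Gaussian experiment homoscedastic, in analogy with Le Cam's finite-dimensional heteroscedastic approximation.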

Article information

Ann. Statist., Volume 24, Number 6 (1996), 2399-2430.

First available in Project Euclid: 16 September 2002


Primary: 62G07: Density estimation
Secondary: 62B15: Theory of statistical experiments; 62G20: Asymptotic properties

Keywords: Nonparametric experiments; deficiency distance; likelihood process; Hungarian construction; asymptotic minimax risk; curve estimation


Nussbaum, Michael. Asymptotic equivalence of density estimation and Gaussian white noise. Ann. Statist. 24 (1996), no. 6, 2399--2430. doi:10.1214/aos/1032181160.



  • Belitser, E. and Levit, B. (1995). On minimax filtering over ellipsoids. Math. Methods Statist. 4 259-273.
  • Brown, L. D. and Low, M. (1996). Asymptotic equivalence of nonparametric regression and white noise. Ann. Statist. 24 2384-2398.
  • Donoho, D. (1994). Asymptotic minimax risk (for sup-norm loss): solution via optimal recovery. Probab. Theory Related Fields 99 145-170.
  • Donoho, D. L. and Johnstone, I. (1992). Minimax estimation via wavelet shrinkage. Unpublished manuscript.
  • Donoho, D. L. and Low, M. (1992). Renormalization exponents and optimal pointwise rates of convergence. Ann. Statist. 20 944-970.
  • Dudley, R. (1989). Real Analysis and Probability. Wadsworth & Brooks/Cole, Pacific Grove, CA.
  • Efroimovich, S. Yu. and Pinsker, M. S. (1982). Estimating a square integrable probability density of a random variable. Problems Inform. Transmission 18 172-189.
  • Falk, M. and Reiss, R.-D. (1992). Poisson approximation of empirical processes. Statist. Probab. Lett. 14 39-48.
  • Golubev, G. K. (1984). On minimax estimation of regression. Problems Inform. Transmission 20 56-64. (In Russian.)
  • Golubev, G. K. (1991). LAN in problems of nonparametric estimation of functions and lower bounds for quadratic risks. Theory Probab. Appl. 36 152-157.
  • Ibragimov, I. A. and Khasminski, R. Z. (1977). On the estimation of an infinite dimensional parameter in Gaussian white noise. Soviet Math. Dokl. 236 1053-1055.
  • Koltchinskii, V. (1994). Komlós-Major-Tusnády approximation for the general empirical process and Haar expansions of classes of functions. J. Theoret. Probab. 7 73-118.
  • Korostelev, A. P. (1993). An asymptotically minimax regression estimate in the uniform norm up to an exact constant. Theory Probab. Appl. 38 737-743.
  • Korostelev, A. P. and Nussbaum, M. (1996). The asymptotic minimax constant for sup-norm loss in nonparametric density estimation. Discussion paper, SFB 373, Humboldt Univ., Berlin.
  • Le Cam, L. (1985). Sur l'approximation de familles de mesures par des familles gaussiennes. Ann. Inst. H. Poincaré 21 225-287.
  • Le Cam, L. (1986). Asymptotic Methods in Statistical Decision Theory. Springer, New York.
  • Le Cam, L. and Yang, G. (1990). Asymptotics in Statistics. Springer, New York.
  • Low, M. (1992). Renormalization and white noise approximation for nonparametric functional estimation problems. Ann. Statist. 20 545-554.
  • Mammen, E. (1986). The statistical information contained in additional observations. Ann. Statist. 14 665-678.
  • Millar, P. W. (1979). Asymptotic minimax theorems for the sample distribution function. Z. Wahrsch. Verw. Gebiete 48 233-252.
  • Nikolskij, S. M. (1975). Approximation of Functions of Several Variables and Imbedding Theorems. Springer, Berlin.
  • Nussbaum, M. (1985). Spline smoothing in regression models and asymptotic efficiency in L2. Ann. Statist. 13 984-997.
  • Parthasarathy, K. R. (1978). Introduction to Probability and Measure. Springer, New York.
  • Pinsker, M. S. (1980). Optimal filtering of square integrable signals in Gaussian white noise. Problems Inform. Transmission 16 120-133.
  • Reiss, R.-D. (1993). A Course on Point Processes. Springer, New York.
  • Rio, E. (1994). Local invariance principles and their application to density estimation. Probab. Theory Related Fields 98 21-45.
  • Shorack, G. and Wellner, J. (1986). Empirical Processes with Applications to Statistics. Wiley, New York.
  • Strasser, H. (1985). Mathematical Theory of Statistics. de Gruyter, Berlin.
  • Tsybakov, A. B. (1994). Efficient nonparametric estimation in L2 with general loss. Unpublished manuscript.
  • Woodroofe, M. (1967). On the maximum deviation of the sample density. Ann. Math. Statist. 38 475-481.