The Annals of Statistics

Projection-Based Approximation and a Duality with Kernel Methods

David L. Donoho and Iain M. Johnstone

Full-text: Open access


Projection pursuit regression and kernel regression are methods for estimating a smooth function of several variables from noisy data obtained at scattered sites. Methods based on local averaging can perform poorly in high dimensions (curse of dimensionality). Intuition and examples have suggested that projection-based approaches can provide better fits. For what sorts of regression functions is this true? When and by how much do projection methods reduce the curse of dimensionality? We make a start by focusing on the two-dimensional problem and study the $L^2$ approximation error (bias) of the two procedures with respect to Gaussian measure. Let RA stand for a certain PPR-type approximation and KA for a particular kernel-type approximation. Building on a simple but striking duality for polynomials, we show that RA behaves significantly better than the minimax rate of approximation for radial functions, while KA performs significantly better than the minimax rate for harmonic functions. In fact, the rate improvements carry over to large classes, RA behaving very well for functions with enough angular smoothness (oscillating slowly with angle), while KA behaves very well for functions with enough Laplacian smoothness (oscillations averaging out locally). The rate improvements matter: They are equivalent to lowering the dimensionality of the problem. For example, for functions with nice tail behavior, RA behaves as if the dimensionality of the problem were 1.5 rather than its nominal value 2. Also, RA and KA are complementary: For a given function, if one method offers a dimensionality reduction, the other does not.
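The error criterion above can be made concrete with a short numerical sketch. The code below (our own illustration, not a construction from the paper; all function names are hypothetical) estimates the squared $L^2$ error under the standard Gaussian measure on $\mathbb{R}^2$ by Monte Carlo, and checks the classical fact underlying the polynomial duality: a low-degree polynomial such as the harmonic function $f(x, y) = xy$ decomposes exactly into a small number of ridge functions, so a projection-based approximation of it incurs zero bias.

```python
import numpy as np

rng = np.random.default_rng(0)

def sq_l2_error_gaussian(f, f_hat, n=100_000, dim=2):
    """Monte Carlo estimate of ||f - f_hat||^2 in L^2 of the
    standard Gaussian measure on R^dim."""
    x = rng.standard_normal((n, dim))
    return float(np.mean((f(x) - f_hat(x)) ** 2))

# Toy check: the harmonic polynomial f(x, y) = x*y equals the sum of two
# ridge functions, ((x + y)^2 - (x - y)^2) / 4, so the projection-based
# approximation error is zero (up to floating-point roundoff).
f = lambda x: x[:, 0] * x[:, 1]
ridge_sum = lambda x: ((x[:, 0] + x[:, 1]) ** 2 - (x[:, 0] - x[:, 1]) ** 2) / 4.0

err = sq_l2_error_gaussian(f, ridge_sum)
```

For non-polynomial targets the error is generally nonzero, and the paper's rate results describe how fast it decays as the number of ridge terms (for RA) or the kernel bandwidth (for KA) is refined.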

Article information

Ann. Statist., Volume 17, Number 1 (1989), 58-106.

First available in Project Euclid: 12 April 2007



Primary: 62J02: General nonlinear regression
Secondary: 62H99: None of the above, but in this section; 41A10: Approximation by polynomials {For approximation by trigonometric polynomials, see 42A10}; 41A25: Rate of convergence, degree of approximation; 42C10: Fourier series in special orthogonal functions (Legendre polynomials, Walsh functions, etc.)

Keywords: projection pursuit regression; kernel regression; curse of dimensionality; rates of convergence; Hermite polynomials; radial functions; angular smoothness; Laplacian smoothness


Donoho, David L.; Johnstone, Iain M. Projection-Based Approximation and a Duality with Kernel Methods. Ann. Statist. 17 (1989), no. 1, 58--106. doi:10.1214/aos/1176347004.
