Open Access
Dimension reduction for regression estimation with nearest neighbor method
Benoît Cadre, Qian Dong
Electron. J. Statist. 4: 436-460 (2010). DOI: 10.1214/09-EJS559

Abstract

In regression with a high-dimensional predictor vector, dimension reduction methods aim at replacing the predictor by a lower dimensional version without loss of information on the regression. In this context, the so-called central mean subspace is the key to dimension reduction. The last two decades have seen the emergence of many methods to estimate the central mean subspace. In this paper, we go one step further and study the performance of a k-nearest neighbor type estimate of the regression function, based on an estimator of the central mean subspace. In our setting, the predictor lies in ℝ^p with fixed p, i.e. p does not depend on the sample size. The estimate is first proved to be consistent. Improvement due to the dimension reduction step is then observed in terms of its rate of convergence. All the results are distribution-free. As an application, we give an explicit rate of convergence using the SIR method. The method is illustrated by a simulation study.
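The abstract describes a two-step estimator: first estimate the central mean subspace (for instance with SIR), then apply k-nearest neighbor regression to the projected predictor. The sketch below is only an illustration of that pipeline under standard assumptions, using a basic slice-based SIR estimator and scikit-learn's KNeighborsRegressor; it is not the authors' implementation, and the slice count, number of components, and number of neighbors are arbitrary choices.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

def sir_directions(X, y, n_slices=10, n_components=1):
    """Estimate central mean subspace directions with Sliced Inverse Regression (SIR)."""
    n, p = X.shape
    # Whiten the predictors using the inverse square root of their covariance
    X_mean = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    inv_sqrt = eigvec @ np.diag(1.0 / np.sqrt(eigval)) @ eigvec.T
    Z = (X - X_mean) @ inv_sqrt
    # Slice the observations by the ordered response and average Z within each slice
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M span the estimated subspace in the whitened scale
    vals, vecs = np.linalg.eigh(M)
    directions = vecs[:, -n_components:]
    # Map the directions back to the original predictor scale
    return inv_sqrt @ directions

# Usage on synthetic data: reduce X to the estimated subspace, then run k-NN regression
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = np.sin(X[:, 0] + X[:, 1]) + 0.1 * rng.normal(size=500)

B = sir_directions(X, y, n_slices=10, n_components=1)
knn = KNeighborsRegressor(n_neighbors=10).fit(X @ B, y)
y_hat = knn.predict(X @ B)
```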

Citation


Benoît Cadre, Qian Dong. "Dimension reduction for regression estimation with nearest neighbor method." Electron. J. Statist. 4: 436-460, 2010. https://doi.org/10.1214/09-EJS559

Information

Published: 2010
First available in Project Euclid: 30 April 2010

zbMATH: 1329.62254
MathSciNet: MR2645492
Digital Object Identifier: 10.1214/09-EJS559

Subjects:
Primary: 62G08, 62H12

Keywords: Central mean subspace, dimension reduction, nearest neighbor method, semiparametric regression, SIR method

Rights: Copyright © 2010 The Institute of Mathematical Statistics and the Bernoulli Society
