Open Access
Estimating multi-index models with response-conditional least squares
Timo Klock, Alessandro Lanteri, Stefano Vigogna
Electron. J. Statist. 15(1): 589-629 (2021). DOI: 10.1214/20-EJS1785

Abstract

The multi-index model is a simple yet powerful high-dimensional regression model which circumvents the curse of dimensionality by assuming $\mathbb{E}[Y|X]=g(A^{\top }X)$ for some unknown index space $A$ and link function $g$. In this paper we introduce a method for estimating the index space, and study the error that an index space estimate propagates into the regression of the link function. The proposed method approximates the index space by the span of linear regression slope coefficients computed over level sets of the data. Being based on ordinary least squares, our approach is easy to implement and computationally efficient. We prove a tight concentration bound that shows $N^{-1/2}$-convergence, but also faithfully describes the dependence on the chosen partition of level sets, hence providing guidance on hyperparameter tuning. The estimator's competitiveness is confirmed by extensive comparisons with state-of-the-art methods, on both synthetic and real data sets. As a second contribution, we establish minimax optimal generalization bounds for k-nearest neighbors and piecewise polynomial regression when trained on samples projected onto any $N^{-1/2}$-consistent estimate of the index space, thus providing complete and provable estimation of the multi-index model.
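The core idea of the abstract can be sketched in a few lines: partition the sample into level sets of the response, run ordinary least squares within each level set, and take the span of the resulting slope vectors as the index space estimate. The sketch below is a minimal illustration of this response-conditional least squares idea, not the authors' reference implementation; the function name `rcls_index_space`, the quantile-based partition, and the SVD step used to extract a basis are assumptions made for the example.

```python
import numpy as np

def rcls_index_space(X, Y, n_levels=10, d=1):
    """Illustrative sketch: estimate a d-dimensional index space from the
    span of OLS slopes computed on response-conditional level sets."""
    # Partition the sample into level sets via response quantiles.
    edges = np.quantile(Y, np.linspace(0.0, 1.0, n_levels + 1))
    slopes = []
    for j in range(n_levels):
        if j < n_levels - 1:
            mask = (Y >= edges[j]) & (Y < edges[j + 1])
        else:
            mask = (Y >= edges[j]) & (Y <= edges[j + 1])
        Xj, Yj = X[mask], Y[mask]
        if len(Yj) <= X.shape[1]:
            continue  # too few points for a stable OLS fit in this level set
        # OLS of Y on X within the level set (with intercept);
        # keep only the slope part of the coefficient vector.
        Z = np.column_stack([np.ones(len(Yj)), Xj])
        beta = np.linalg.lstsq(Z, Yj, rcond=None)[0][1:]
        slopes.append(beta)
    # A basis for the span of the slopes: top-d left singular vectors
    # of the matrix whose columns are the level-set slope vectors.
    U, _, _ = np.linalg.svd(np.array(slopes).T, full_matrices=False)
    return U[:, :d]
```

On a single-index example such as $Y=(a^{\top}X)^{3}+\varepsilon$, the leading singular vector returned by this sketch aligns with the true direction $a$, mirroring the consistency behavior the paper proves for the actual estimator.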

Citation


Timo Klock, Alessandro Lanteri, Stefano Vigogna. "Estimating multi-index models with response-conditional least squares." Electron. J. Statist. 15(1): 589-629, 2021. https://doi.org/10.1214/20-EJS1785

Information

Received: 1 June 2020; Published: 2021
First available in Project Euclid: 19 January 2021

Digital Object Identifier: 10.1214/20-EJS1785

Subjects:
Primary: 62G05
Secondary: 62G08, 62H99

Keywords: finite sample bounds, multi-index model, nonparametric regression, sufficient dimension reduction
