Open Access
Isotonic regression meets LASSO
Matey Neykov
Electron. J. Statist. 13(1): 710-746 (2019). DOI: 10.1214/19-EJS1537

Abstract

This paper studies a two-step procedure for monotone increasing additive single index models with Gaussian designs. The proposed procedure is simple, easy to implement with existing software, and consists of consecutively applying the LASSO and isotonic regression. Aside from formalizing this procedure, we provide theoretical guarantees regarding its performance: (1) we show that the procedure controls the in-sample squared error; (2) we demonstrate that it can be used to predict new observations, by showing that the absolute prediction error can be controlled with high probability. Our bounds exhibit a tradeoff between two rates: the minimax rate for high dimensional estimation under quadratic loss, and the minimax nonparametric rate for estimating a monotone increasing function.
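As a minimal sketch of the two-step procedure the abstract describes, the following uses scikit-learn's `Lasso` and `IsotonicRegression` on simulated data. The data-generating choices here (sample size, sparsity level, the `tanh` link, the `alpha` penalty) are illustrative assumptions, not values taken from the paper.

```python
# Hedged sketch: step 1 fits the LASSO to estimate the linear index,
# step 2 fits isotonic regression of y on the estimated index.
# All simulation parameters below are illustrative, not from the paper.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
n, p, s = 200, 50, 3
X = rng.standard_normal((n, p))        # Gaussian design, as assumed in the paper
beta = np.zeros(p)
beta[:s] = 1.0                         # sparse index vector
f = np.tanh                            # a monotone increasing link function
y = f(X @ beta) + 0.1 * rng.standard_normal(n)

# Step 1: LASSO estimates the index direction.
lasso = Lasso(alpha=0.05).fit(X, y)
index = X @ lasso.coef_                # estimated single index

# Step 2: isotonic regression of y on the estimated index yields a
# monotone increasing fit of the link function.
iso = IsotonicRegression(out_of_bounds="clip").fit(index, y)
fitted = iso.predict(index)
```

New observations are then predicted by computing their estimated index with `lasso.coef_` and passing it through `iso.predict`, matching the prediction guarantee discussed in the abstract.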

Citation


Matey Neykov. "Isotonic regression meets LASSO." Electron. J. Statist. 13 (1) 710 - 746, 2019. https://doi.org/10.1214/19-EJS1537

Information

Received: 1 February 2018; Published: 2019
First available in Project Euclid: 20 February 2019

zbMATH: 07038002
MathSciNet: MR3914933
Digital Object Identifier: 10.1214/19-EJS1537

Keywords: high dimensional statistics, isotonic regression, LASSO, monotone single index models, sparsity
