Open Access
I-LAMM for sparse learning: Simultaneous control of algorithmic complexity and statistical error
Jianqing Fan, Han Liu, Qiang Sun, Tong Zhang
Ann. Statist. 46(2): 814-841 (April 2018). DOI: 10.1214/17-AOS1568

Abstract

We propose a computational framework named iterative local adaptive majorize-minimization (I-LAMM) to simultaneously control algorithmic complexity and statistical error when fitting high-dimensional models. I-LAMM is a two-stage algorithmic implementation of the local linear approximation to a family of folded concave penalized quasi-likelihoods. The first stage solves a convex program with a crude precision tolerance to obtain a coarse initial estimator, which is further refined in the second stage by iteratively solving a sequence of convex programs with smaller precision tolerances. Theoretically, we establish a phase transition: the first stage has a sublinear iteration complexity, while the second stage achieves an improved linear rate of convergence. Though this framework is completely algorithmic, it provides solutions with optimal statistical performance and controlled algorithmic complexity for a large family of nonconvex optimization problems. The effect of the iterations on the statistical error is clearly demonstrated via a contraction property. Our theory relies on a localized version of the sparse/restricted eigenvalue condition, which allows us to analyze a large family of loss and penalty functions and to provide optimality guarantees under very weak assumptions (e.g., I-LAMM requires much weaker minimal signal strength than other procedures). Thorough numerical results are provided to support the theory.
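To make the two-stage scheme concrete, here is a minimal sketch, not the authors' code, of I-LAMM applied to folded-concave (SCAD-type) penalized least squares. Each stage runs local adaptive majorize-minimization (proximal steps with an adaptively inflated quadratic-majorizer parameter) on a weighted-l1 convex program; the second stage refines the coarse stage-one solution through local linear approximations of the penalty. All names, default tolerances (tol_coarse, tol_fine), and the number of refinements n_llas are illustrative assumptions, not the paper's recommendations.

```python
# Illustrative sketch of the two-stage I-LAMM idea for SCAD-penalized least squares.
# Function names, tolerances, and defaults are placeholders chosen for this example.
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding operator."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def scad_derivative(beta, lam, a=3.7):
    """Derivative of the SCAD penalty, used for the local linear approximation."""
    b = np.abs(beta)
    return lam * ((b <= lam) + np.maximum(a * lam - b, 0.0) / ((a - 1) * lam) * (b > lam))

def lamm_stage(X, y, beta, weights, tol, phi0=1.0, gamma=2.0, max_iter=500):
    """One stage: solve a weighted-l1 convex program by local adaptive
    majorize-minimization, up to precision `tol`."""
    n = len(y)
    for _ in range(max_iter):
        grad = X.T @ (X @ beta - y) / n
        phi = phi0
        while True:  # inflate the majorizer parameter until it dominates the loss
            beta_new = soft_threshold(beta - grad / phi, weights / phi)
            diff = beta_new - beta
            if diff @ (X.T @ (X @ diff)) / n <= phi * (diff @ diff) + 1e-12:
                break
            phi *= gamma
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

def ilamm(X, y, lam, tol_coarse=1e-2, tol_fine=1e-5, n_llas=3):
    """Stage 1: crude l1 solve. Stage 2: LLA refinements with a tight tolerance."""
    beta = np.zeros(X.shape[1])
    beta = lamm_stage(X, y, beta, lam * np.ones_like(beta), tol_coarse)
    for _ in range(n_llas):
        weights = scad_derivative(beta, lam)
        beta = lamm_stage(X, y, beta, weights, tol_fine)
    return beta
```

In this sketch, tol_coarse and tol_fine play the roles of the crude and refined precision tolerances of the two stages described in the abstract; the paper's theory concerns the iteration complexity and statistical error of such a scheme, not these particular default values.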

Citation


Jianqing Fan, Han Liu, Qiang Sun, Tong Zhang. "I-LAMM for sparse learning: Simultaneous control of algorithmic complexity and statistical error." Ann. Statist. 46(2): 814-841, April 2018. https://doi.org/10.1214/17-AOS1568

Information

Received: 1 July 2015; Revised: 1 March 2017; Published: April 2018
First available in Project Euclid: 3 April 2018

zbMATH: 06870280
MathSciNet: MR3782385
Digital Object Identifier: 10.1214/17-AOS1568

Subjects:
Primary: 62J07
Secondary: 62C20, 62H35

Keywords: algorithmic statistics, iteration complexity, local adaptive MM, nonconvex statistical optimization, optimal rate of convergence

Rights: Copyright © 2018 Institute of Mathematical Statistics
