Open Access
Asymptotic optimality and efficient computation of the leave-subject-out cross-validation
Ganggang Xu, Jianhua Z. Huang
Ann. Statist. 40(6): 3003-3030 (December 2012). DOI: 10.1214/12-AOS1063

Abstract

Although leave-subject-out cross-validation (CV) has been widely used in practice for tuning-parameter selection in various nonparametric and semiparametric models of longitudinal data, its theoretical properties are unknown and solving the associated optimization problem is computationally expensive, especially when there are multiple tuning parameters. In this paper, focusing on the penalized spline method, we show that the leave-subject-out CV is optimal in the sense that it is asymptotically equivalent to minimizing the empirical squared error loss function. An efficient Newton-type algorithm is developed to compute the penalty parameters that optimize the CV criterion. Simulated and real data are used to demonstrate the effectiveness of the leave-subject-out CV in selecting both the penalty parameters and the working correlation matrix.
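To make the criterion concrete, the sketch below computes a leave-subject-out CV score: each subject's entire block of observations is held out in turn, the model is refit on the remaining subjects, and the squared prediction error on the held-out subject is accumulated. This is only an illustrative implementation of the general criterion; the `fit`/`predict` pair shown (a cubic-polynomial smoother via `numpy.polyfit`) is a placeholder for the paper's penalized spline estimator, and the simulated data are hypothetical.

```python
import numpy as np

def leave_subject_out_cv(subjects, x, y, fit, predict):
    """Leave-subject-out CV score: for each subject, refit on all other
    subjects' data and sum the squared prediction errors on the held-out
    subject's observations; return the average over all observations."""
    total = 0.0
    for s in np.unique(subjects):
        held = subjects == s
        model = fit(x[~held], y[~held])            # fit without subject s
        resid = y[held] - predict(model, x[held])  # predict subject s
        total += np.sum(resid ** 2)
    return total / len(y)

# Toy longitudinal data: 10 subjects with 8 observations each.
rng = np.random.default_rng(0)
subjects = np.repeat(np.arange(10), 8)
x = rng.uniform(0.0, 1.0, size=subjects.size)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(subjects.size)

# Cubic-polynomial smoother standing in for the penalized spline fit.
score = leave_subject_out_cv(
    subjects, x, y,
    fit=lambda xt, yt: np.polyfit(xt, yt, deg=3),
    predict=lambda coef, xv: np.polyval(coef, xv),
)
```

In practice one would minimize this score over the penalty parameters (and candidate working correlation structures); the paper's contribution is that this minimization is asymptotically equivalent to minimizing the true squared error loss, and can be carried out efficiently with a Newton-type algorithm rather than by brute-force refitting.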

Citation


Ganggang Xu, Jianhua Z. Huang. "Asymptotic optimality and efficient computation of the leave-subject-out cross-validation." Ann. Statist. 40(6): 3003-3030, December 2012. https://doi.org/10.1214/12-AOS1063

Information

Published: December 2012
First available in Project Euclid: 8 February 2013

zbMATH: 1296.62096
MathSciNet: MR3097967
Digital Object Identifier: 10.1214/12-AOS1063

Subjects:
Primary: 62G08
Secondary: 41A15 , 62G05 , 62G20 , 62H12

Keywords: cross-validation , generalized estimating equations , multiple smoothing parameters , penalized splines , working correlation matrices

Rights: Copyright © 2012 Institute of Mathematical Statistics
