Open Access
Nonlinear sufficient dimension reduction for functional data
Bing Li, Jun Song
Ann. Statist. 45(3): 1059-1095 (June 2017). DOI: 10.1214/16-AOS1475


We propose a general theory and estimation procedures for nonlinear sufficient dimension reduction where both the predictor and the response may be random functions. The relation between the response and the predictor can be arbitrary, and the sets of observed time points can vary from subject to subject. The functional and nonlinear nature of the problem leads to the construction of two functional spaces: the first, representing the functional data, is assumed to be a Hilbert space; the second, characterizing the nonlinearity, is assumed to be a reproducing kernel Hilbert space. A particularly attractive feature of our construction is that the two spaces are nested, in the sense that the kernel for the second space is determined by the inner product of the first. We propose two estimators for this general dimension reduction problem, and establish the consistency and convergence rate for one of them. These asymptotic results are flexible enough to accommodate both fully and partially observed functional data. We investigate the performance of our estimators by simulations, and apply them to data sets on speech recognition and handwritten symbols.
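The nested-space idea — an RKHS whose kernel is built from the inner product of the underlying Hilbert space of functions — can be illustrated with a toy sketch. This is our own illustration, not the authors' code: functions are represented by coefficients in a finite orthonormal basis (so the first space's inner product reduces to the Euclidean one), and a Gaussian kernel is then defined through that inner product.

```python
import numpy as np

def inner(f, g):
    """Inner product in the first space: Euclidean inner product of
    basis-coefficient vectors (an orthonormal basis is assumed)."""
    return float(np.dot(f, g))

def kernel(f, g, gamma=1.0):
    """Gaussian kernel of the second space, determined by the first
    space's inner product: k(f, g) = exp(-gamma * ||f - g||^2)."""
    sq_dist = inner(f, f) - 2.0 * inner(f, g) + inner(g, g)
    return np.exp(-gamma * sq_dist)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))  # 5 toy "functions", 8 basis coefficients each
K = np.array([[kernel(x, y) for y in X] for x in X])  # Gram matrix

assert np.allclose(K, K.T)           # symmetric
assert np.allclose(np.diag(K), 1.0)  # k(f, f) = 1 for a Gaussian kernel
```

The Gram matrix `K` is symmetric positive semidefinite, as required of an RKHS kernel; the hyperparameter `gamma` and the basis truncation are choices of this sketch, not of the paper.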


Citation

Bing Li, Jun Song. "Nonlinear sufficient dimension reduction for functional data." Ann. Statist. 45(3): 1059-1095, June 2017.


Received: 1 December 2015; Published: June 2017
First available in Project Euclid: 13 June 2017

zbMATH: 1371.62003
MathSciNet: MR3662448
Digital Object Identifier: 10.1214/16-AOS1475

Primary: 62B05, 62G08, 62G20, 62H99

Keywords: convergence rate, handwriting data, linear operator, reproducing kernel Hilbert space, sliced average variance estimator, sliced inverse regression, speech recognition

Rights: Copyright © 2017 Institute of Mathematical Statistics
