Prediction bounds for higher order total variation regularized least squares
Francesco Ortelli, Sara van de Geer
Ann. Statist. 49(5): 2755-2773 (October 2021). DOI: 10.1214/21-AOS2054

Abstract

We establish adaptive results for trend filtering: least squares estimation with a penalty on the total variation of (k−1)th order differences. Our approach is based on combining a general oracle inequality for the ℓ1-penalized least squares estimator with “interpolating vectors” to upper bound the “effective sparsity.” This allows one to show that the ℓ1-penalty on the kth order differences leads to an estimator that can adapt to the number of jumps in the (k−1)th order differences of the underlying signal or an approximation thereof. We show the result for k ∈ {1, 2, 3, 4} and indicate how it could be derived for general k ∈ ℕ.
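For readers unfamiliar with trend filtering, the estimator described in the abstract can be sketched in standard notation (the symbols y, λ and D^(k) are not defined on this page and are introduced here only for illustration): given observations y ∈ ℝⁿ, the kth order trend filtering estimator solves a penalized least squares problem,

```latex
\hat{f} \;=\; \operatorname*{arg\,min}_{f \in \mathbb{R}^n}
\left\{ \frac{1}{n}\,\|y - f\|_2^2 \;+\; \lambda\,\big\|D^{(k)} f\big\|_1 \right\},
```

where D^(k) denotes the kth order difference operator, so that ‖D^(k) f‖₁ is the total variation of the (k−1)th order differences of f, and λ > 0 is a tuning parameter. For k = 1 this is the fused lasso (piecewise constant fits); for k = 2 the fits are piecewise linear, and so on.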

Funding Statement

We acknowledge support for this project from the Swiss National Science Foundation (SNF Grant 200020_169011).

Acknowledgments

We thank the Associate Editor and the referees for their very helpful remarks.

Citation


Francesco Ortelli. Sara van de Geer. "Prediction bounds for higher order total variation regularized least squares." Ann. Statist. 49 (5) 2755 - 2773, October 2021. https://doi.org/10.1214/21-AOS2054

Information

Received: 1 July 2020; Revised: 1 January 2021; Published: October 2021
First available in Project Euclid: 12 November 2021

MathSciNet: MR4338382
zbMATH: 1486.62205
Digital Object Identifier: 10.1214/21-AOS2054

Subjects:
Primary: 62J05
Secondary: 62J99

Keywords: analysis , compatibility , Lasso , minimax , Moore–Penrose pseudo inverse , Oracle inequality , projection , Total variation regularization

Rights: Copyright © 2021 Institute of Mathematical Statistics

JOURNAL ARTICLE
19 PAGES


Vol. 49 • No. 5 • October 2021