Analysis of generalized Bregman surrogate algorithms for nonsmooth nonconvex statistical learning
Yiyuan She, Zhifeng Wang, Jiuwu Jin
Ann. Statist. 49(6): 3434-3459 (December 2021). DOI: 10.1214/21-AOS2090

Abstract

Modern statistical applications often involve minimizing an objective function that may be nonsmooth and/or nonconvex. This paper focuses on a broad Bregman-surrogate algorithm framework that includes the local linear approximation, mirror descent, iterative thresholding, DC programming and many others as particular instances. The recharacterization via generalized Bregman functions enables us to construct suitable error measures and establish global convergence rates for nonconvex and nonsmooth objectives in possibly high dimensions. For sparse learning problems with a composite objective, under some regularity conditions, the obtained estimators as the surrogate’s fixed points, though not necessarily local minimizers, enjoy provable statistical guarantees, and the sequence of iterates can be shown to approach the statistical truth within the desired accuracy geometrically fast. The paper also studies how to design adaptive momentum-based accelerations without assuming convexity or smoothness, by carefully controlling the stepsize and relaxation parameters.
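
For orientation, the display below is a minimal sketch of the surrogate construction in its classical, differentiable special case; the loss f, the generating function psi and the divergence Delta_psi follow standard majorization-minimization (MM) conventions and are illustrative assumptions rather than the paper's exact generalized-Bregman definitions.

\[
\Delta_\psi(\beta,\gamma) = \psi(\beta) - \psi(\gamma) - \langle \nabla\psi(\gamma),\, \beta - \gamma \rangle,
\qquad
\beta^{(t+1)} \in \operatorname*{arg\,min}_{\beta}\ \big\{ f(\beta) + \Delta_\psi\big(\beta, \beta^{(t)}\big) \big\}.
\]

As an illustration, taking \(f(\beta) = \tfrac{1}{2}\|y - X\beta\|_2^2 + \lambda\|\beta\|_1\) and the quadratic choice \(\psi(\beta) = \tfrac{\rho}{2}\|\beta\|_2^2 - \tfrac{1}{2}\|y - X\beta\|_2^2\) with \(\rho \ge \|X\|_2^2\), the squared-loss terms cancel in the surrogate and the update reduces to componentwise soft-thresholding of the gradient step \(\beta^{(t)} + X^\top(y - X\beta^{(t)})/\rho\) at level \(\lambda/\rho\), recovering the iterative thresholding instance mentioned in the abstract.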

Funding Statement

The work is partially supported by the National Science Foundation.

Acknowledgments

The authors would like to thank the Editor, Associate Editor and referees for suggestions that significantly improved the paper.

Citation

Yiyuan She, Zhifeng Wang, Jiuwu Jin. "Analysis of generalized Bregman surrogate algorithms for nonsmooth nonconvex statistical learning." Ann. Statist. 49(6): 3434-3459, December 2021. https://doi.org/10.1214/21-AOS2090

Information

Received: 1 February 2021; Revised: 1 May 2021; Published: December 2021
First available in Project Euclid: 14 December 2021

MathSciNet: MR4352536
zbMATH: 1486.62212
Digital Object Identifier: 10.1214/21-AOS2090

Subjects:
Primary: 49J52, 68Q25, 90C26

Keywords: Bregman divergence, MM algorithms, momentum-based acceleration, nonconvex optimization, nonsmooth optimization, statistical algorithmic analysis

Rights: Copyright © 2021 Institute of Mathematical Statistics

JOURNAL ARTICLE
26 PAGES

Vol. 49 • No. 6 • December 2021