Open Access
2024 High-dimensional limit of one-pass SGD on least squares
Elizabeth Collins–Woodfin, Elliot Paquette
Electron. Commun. Probab. 29: 1-15 (2024). DOI: 10.1214/23-ECP571


We describe the high-dimensional limit of one-pass, single-batch stochastic gradient descent (SGD) on a least squares problem. The limit is taken with a non-vanishing step size and with the number of samples proportional to the problem dimension. The limit is described by a high-dimensional stochastic differential equation, which is shown to approximate the state evolution of SGD. As a corollary, the statistical risk is approximated by the solution of a convolution-type Volterra equation, with errors vanishing as the dimension tends to infinity. The sense of convergence is the weakest under which the statistical risks of the two processes coincide. This analysis is distinguished from existing ones by the type of high-dimensional limit taken, as well as by the generality of the covariance structure of the samples.
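To make the setting concrete, here is a minimal sketch of one-pass, single-batch SGD on least squares in the proportional regime the abstract describes: the sample count n scales with the dimension d, each sample is used exactly once, and the step size gamma stays fixed as d grows. The identity covariance, noise level, and step-size scaling below are illustrative choices, not the paper's specific setup.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 500        # problem dimension
n = 4 * d      # number of samples, proportional to dimension
gamma = 0.3    # non-vanishing step size

# Ground-truth signal and Gaussian samples (identity covariance for simplicity).
x_star = rng.standard_normal(d) / np.sqrt(d)
A = rng.standard_normal((n, d))
b = A @ x_star + 0.1 * rng.standard_normal(n)

x = np.zeros(d)
risk = []
for k in range(n):                        # one pass: each sample seen once
    a, y = A[k], b[k]
    x -= (gamma / d) * (a @ x - y) * a    # single-sample gradient step
    risk.append(0.5 * np.linalg.norm(x - x_star) ** 2)
```

The recorded `risk` trajectory is the quantity whose high-dimensional limit the paper characterizes via a convolution-type Volterra equation; for large d it concentrates around a deterministic curve.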




Received: 31 May 2023; Accepted: 15 December 2023; Published: 2024
First available in Project Euclid: 5 February 2024

arXiv: 2304.06847
Digital Object Identifier: 10.1214/23-ECP571

Primary: 60H30

Keywords: optimization, random matrix theory, stochastic differential equations, stochastic gradient descent
