
Open Access
June 2020
Bridging the gap between constant step size stochastic gradient descent and Markov chains
Aymeric Dieuleveut, Alain Durmus, Francis Bach
Ann. Statist. 48(3): 1348–1382 (June 2020). DOI: 10.1214/19-AOS1850

Abstract

We consider the minimization of a strongly convex objective function given access to unbiased estimates of its gradient through stochastic gradient descent (SGD) with constant step size. While a detailed analysis was previously performed only for quadratic functions, we provide an explicit asymptotic expansion of the moments of the averaged SGD iterates that outlines the dependence on initial conditions, the effect of noise and the step size, as well as the lack of convergence in the general (nonquadratic) case. For this analysis we bring tools from Markov chain theory into the analysis of stochastic gradient descent. We then show that Richardson–Romberg extrapolation may be used to get closer to the global optimum, and we demonstrate empirical improvements of the new extrapolation scheme.
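The abstract describes two algorithmic ingredients that are straightforward to sketch in code: constant step size SGD with Polyak–Ruppert averaging of the iterates (the average converges to the mean of the underlying Markov chain's stationary distribution, which sits within O(γ) of the optimum in the nonquadratic case), and the Richardson–Romberg combination 2θ̄_γ − θ̄_{2γ} of two runs with step sizes γ and 2γ, which cancels the first-order bias term. The NumPy sketch below is illustrative only, not the paper's code; the logistic test problem, step sizes and iteration counts are assumptions chosen so the effect is visible.

    import numpy as np

    def averaged_sgd(stoch_grad, theta0, gamma, n_iter, rng):
        # Constant step size SGD; returns the Polyak-Ruppert average of
        # the iterates, i.e., the empirical mean of the underlying
        # Markov chain (theta_k).
        theta = theta0.copy()
        theta_bar = np.zeros_like(theta0)
        for k in range(n_iter):
            theta = theta - gamma * stoch_grad(theta, rng)
            theta_bar += (theta - theta_bar) / (k + 1)  # running mean
        return theta_bar

    # Hypothetical test problem (an assumption, not from the paper): a
    # well-specified logistic model, so the population risk minimizer
    # is the generating parameter theta_star.
    rng = np.random.default_rng(0)
    d = 5
    theta_star = rng.standard_normal(d) / np.sqrt(d)

    def stoch_grad(theta, rng):
        # One fresh sample (x, y) per step yields an unbiased gradient
        # of the population logistic loss E[log(1 + exp(-y * x @ theta))].
        x = rng.standard_normal(d)
        y = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-x @ theta_star)) else -1.0
        return -y * x / (1.0 + np.exp(y * (x @ theta)))

    gamma, n_iter = 0.2, 200_000
    bar_1 = averaged_sgd(stoch_grad, np.zeros(d), gamma, n_iter, rng)
    bar_2 = averaged_sgd(stoch_grad, np.zeros(d), 2 * gamma, n_iter, rng)

    # Richardson-Romberg extrapolation: in the nonquadratic case the
    # averaged iterate satisfies
    #   theta_bar(gamma) = theta_star + gamma * Delta + O(gamma^2),
    # so 2 * theta_bar(gamma) - theta_bar(2 * gamma) cancels the
    # first-order bias term.
    theta_rr = 2.0 * bar_1 - bar_2
    for name, est in [("gamma", bar_1), ("2*gamma", bar_2), ("extrapolated", theta_rr)]:
        print(f"{name:>12}: error = {np.linalg.norm(est - theta_star):.4f}")

A well-specified logistic model is used here because its population minimizer equals the generating parameter, so the bias of each averaged run can be measured directly; for a quadratic objective the averaged iterate has no first-order bias, which is why the nonquadratic case is where extrapolation helps.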

Citation

Aymeric Dieuleveut, Alain Durmus, Francis Bach. "Bridging the gap between constant step size stochastic gradient descent and Markov chains." Ann. Statist. 48(3): 1348–1382, June 2020. https://doi.org/10.1214/19-AOS1850

Information

Received: 1 April 2018; Revised: 1 April 2019; Published: June 2020
First available in Project Euclid: 17 July 2020

zbMATH: 07241594
MathSciNet: MR4124326
Digital Object Identifier: 10.1214/19-AOS1850

Subjects:
Primary: 62L20
Secondary: 90C15, 93E35

Keywords: Markov chains, Stochastic gradient descent

Rights: Copyright © 2020 Institute of Mathematical Statistics
