Stochastic Optimization with Heavy-Tailed Noise via Accelerated Gradient Clipping
Abstract
In this paper, we propose a new accelerated stochastic first-order method, clipped-SSTM, for smooth convex stochastic optimization with heavy-tailed noise in the stochastic gradients, and we derive the first high-probability complexity bounds for this method, closing a gap in the theory of stochastic optimization with heavy-tailed noise. Our method combines a special variant of accelerated Stochastic Gradient Descent (SGD) with clipping of the stochastic gradients. We extend the method to the strongly convex case and prove new complexity bounds that outperform state-of-the-art results in this setting. Finally, we extend our proof technique and derive the first non-trivial high-probability complexity bounds for SGD with clipping without the light-tails assumption on the noise.
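As a rough illustration of the clipping building block described in the abstract, the following is a minimal sketch of plain clipped-SGD (the simpler method also analyzed in the paper), not the accelerated clipped-SSTM itself; the quadratic toy objective, the step size `gamma`, the clipping level `lam`, and the Student-t noise are all illustrative assumptions, not settings from the paper.

```python
import numpy as np

def clip(g, lam):
    """Clip a stochastic gradient g to Euclidean norm at most lam."""
    norm = np.linalg.norm(g)
    return g if norm <= lam else g * (lam / norm)

def clipped_sgd(grad_oracle, x0, gamma, lam, n_iters, rng):
    """Plain SGD with gradient clipping (a sketch, not the accelerated clipped-SSTM).

    grad_oracle(x, rng) returns an unbiased but possibly heavy-tailed
    stochastic gradient at x; gamma is the step size, lam the clipping level.
    """
    x = x0.copy()
    for _ in range(n_iters):
        g = grad_oracle(x, rng)
        x = x - gamma * clip(g, lam)
    return x

# Toy usage: f(x) = 0.5 * ||x||^2 with heavy-tailed (Student-t) gradient noise;
# all constants here are illustrative choices, not values from the paper.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    oracle = lambda x, rng: x + rng.standard_t(df=3, size=x.shape)
    x_hat = clipped_sgd(oracle, x0=np.ones(10), gamma=0.05, lam=1.0,
                        n_iters=5000, rng=rng)
    print("final squared norm:", float(x_hat @ x_hat))
```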
- Publication:
- arXiv e-prints
- Pub Date:
- May 2020
- DOI:
- 10.48550/arXiv.2005.10785
- arXiv:
- arXiv:2005.10785
- Bibcode:
- 2020arXiv200510785G
- Keywords:
- Mathematics - Optimization and Control;
- Computer Science - Machine Learning
- E-Print:
- NeurIPS 2020