Semantic Scholar Open Access 2019 459 citations

SGD: General Analysis and Improved Rates

Robert Mansel Gower, Nicolas Loizou, Xun Qian, Alibek Sailanbayev, Egor Shulgin, +1 more

Abstract

We propose a general yet simple theorem describing the convergence of SGD under the arbitrary sampling paradigm. Our theorem describes the convergence of an infinite array of variants of SGD, each of which is associated with a specific probability law governing the data selection rule used to form mini-batches. This is the first time such an analysis is performed, and most of our variants of SGD were never explicitly considered in the literature before. Our analysis relies on the recently introduced notion of expected smoothness and does not rely on a uniform bound on the variance of the stochastic gradients. By specializing our theorem to different mini-batching strategies, such as sampling with replacement and independent sampling, we derive exact expressions for the stepsize as a function of the mini-batch size. With this we can also determine the mini-batch size that optimizes the total complexity, and show explicitly that as the variance of the stochastic gradient evaluated at the minimum grows, so does the optimal mini-batch size. For zero variance, the optimal mini-batch size is one. Moreover, we prove insightful stepsize-switching rules which describe when one should switch from a constant to a decreasing stepsize regime.
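The abstract centers on two practical ideas: mini-batch SGD under a chosen sampling law (e.g., sampling with replacement) and a stepsize-switching rule that moves from a constant to a decreasing stepsize. Below is a minimal sketch of these ideas for a generic finite-sum objective. The `minibatch_sgd` helper, its uniform sampling-with-replacement rule, and the O(1/t) decay after `switch_iter` are illustrative placeholders, not the paper's exact expressions, which depend on the expected smoothness constant and the variance of the stochastic gradient at the minimum.

```python
import numpy as np

def minibatch_sgd(grad_i, x0, n, batch_size=8, lr=0.1,
                  switch_iter=None, num_iters=1000, rng=None):
    """Mini-batch SGD with sampling with replacement and an optional
    constant-to-decreasing stepsize switch.

    grad_i(x, i) returns the gradient of the i-th component function at x.
    The switch rule (constant stepsize until switch_iter, then O(1/t)
    decay) mirrors the kind of stepsize-switching strategy the paper
    analyzes; the paper's exact thresholds and constants are not
    reproduced here.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    for t in range(num_iters):
        # Sampling with replacement: mini-batch indices drawn i.i.d. uniform.
        batch = rng.integers(0, n, size=batch_size)
        g = np.mean([grad_i(x, i) for i in batch], axis=0)
        # Stepsize-switching rule: constant, then decreasing after switch_iter.
        if switch_iter is not None and t >= switch_iter:
            step = lr * switch_iter / (t + 1)  # illustrative O(1/t) decay
        else:
            step = lr
        x -= step * g
    return x

# Usage on a toy least-squares problem (hypothetical example data):
# f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2
rng = np.random.default_rng(0)
A, b = rng.normal(size=(100, 5)), rng.normal(size=100)
grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i]
x_hat = minibatch_sgd(grad_i, np.zeros(5), n=100, switch_iter=300, rng=rng)
```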

Authors (6)

Robert Mansel Gower

Nicolas Loizou

Xun Qian

Alibek Sailanbayev

Egor Shulgin

Peter Richtárik

Citation Format

Gower, R.M., Loizou, N., Qian, X., Sailanbayev, A., Shulgin, E., Richtárik, P. (2019). SGD: General Analysis and Improved Rates. https://www.semanticscholar.org/paper/b4f174539e0123fd6bfdc1c6e97b353472a10eff

Quick Access

PDF not directly available; check the original source.

Journal Information
Publication Year: 2019
Language: en
Total Citations: 459
Source Database: Semantic Scholar
Access: Open Access ✓