Seminar on Probability Theory and Mathematical Statistics
October 11, 2024 18:00–20:00, St. Petersburg, PDMI, room 311 (nab. r. Fontanki, 27)


Nonasymptotic Analysis of Stochastic Gradient Descent with the Richardson–Romberg Extrapolation

S. V. Samsonov


Abstract: We address the problem of solving strongly convex and smooth minimization problems using the stochastic gradient descent (SGD) algorithm with a constant step size. Previous works suggested combining the Polyak-Ruppert averaging procedure with the Richardson-Romberg extrapolation technique to reduce the asymptotic bias of SGD at the expense of a mild increase in variance. We significantly extend previous results by providing an expansion of the mean-squared error of the resulting estimator with respect to the number of iterations $n$. More precisely, we show that the mean-squared error can be decomposed into the sum of two terms: a leading one of order $\mathcal{O}(n^{-1/2})$ with explicit dependence on a minimax-optimal asymptotic covariance matrix, and a second-order term of order $\mathcal{O}(n^{-3/4})$. We also extend this result to the $p$-th moment bound, keeping the same scaling of the remainders with respect to $n$. Our analysis relies on the properties of the SGD iterates viewed as a time-homogeneous Markov chain. In particular, we establish that this chain is geometrically ergodic with respect to a suitably defined weighted Wasserstein semimetric.
Based on the joint work https://arxiv.org/abs/2410.05106
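
To illustrate the estimator discussed in the abstract, the following minimal Python sketch combines Polyak-Ruppert averaging with Richardson-Romberg extrapolation for constant-step-size SGD on a synthetic strongly convex quadratic. The problem data, step size, and noise model are assumptions chosen for illustration; this is not the speaker's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative problem: f(theta) = 0.5 * theta^T A theta - b^T theta (strongly convex)
d = 5
A = np.diag(np.linspace(1.0, 5.0, d))
b = rng.normal(size=d)
theta_star = np.linalg.solve(A, b)      # exact minimizer, used only to report errors

def noisy_grad(theta, noise_std=1.0):
    """Stochastic gradient: exact gradient plus additive Gaussian noise."""
    return A @ theta - b + noise_std * rng.normal(size=d)

def sgd_polyak_ruppert(gamma, n_iters, theta0):
    """Constant-step-size SGD returning the Polyak-Ruppert average of the iterates."""
    theta = theta0.copy()
    avg = np.zeros_like(theta)
    for t in range(1, n_iters + 1):
        theta = theta - gamma * noisy_grad(theta)
        avg += (theta - avg) / t          # running average of the iterates
    return avg

n, gamma = 100_000, 0.05
theta0 = np.zeros(d)

avg_gamma  = sgd_polyak_ruppert(gamma, n, theta0)        # averaged SGD with step gamma
avg_2gamma = sgd_polyak_ruppert(2 * gamma, n, theta0)    # averaged SGD with step 2*gamma

# Richardson-Romberg extrapolation: the asymptotic bias of the averaged iterate is
# approximately linear in gamma, so 2*estimate(gamma) - estimate(2*gamma) cancels
# the leading bias term at the cost of a mild variance increase.
theta_rr = 2.0 * avg_gamma - avg_2gamma

print("error, averaged SGD      :", np.linalg.norm(avg_gamma - theta_star))
print("error, Richardson-Romberg:", np.linalg.norm(theta_rr - theta_star))
```

On such a synthetic problem the extrapolated estimate typically shows a smaller residual bias than plain averaged SGD with the same step size, which is the effect quantified nonasymptotically in the talk.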