Colloquium | Wei Biao Wu
Posted 27 January 2021
Wednesday 24 March 2021, 4:00pm – 5:00pm
Fast Algorithms for Estimating Covariance Matrices of Stochastic Gradient Descent Solutions
Online, Zoom Webinar
We’re very excited to welcome our first Heilbronn Virtual Visiting Professor, Wei Biao Wu (University of Chicago), for an online colloquium on Wednesday 24 March.
For further information and to register for the colloquium, please visit the event website.
Abstract: Stochastic gradient descent (SGD), an important optimization method in machine learning, is widely used for parameter estimation, especially in the online setting where data arrive in a stream. While this recursive algorithm is popular for its computational and memory efficiency, its solutions are subject to randomness. In this talk we shall estimate the asymptotic covariance matrices of the averaged SGD iterates (ASGD) in a fully online fashion. Based on the recursive estimator and classic asymptotic normality results for ASGD, we can conduct online statistical inference for SGD estimators and construct asymptotically valid confidence intervals for model parameters. The algorithm for the recursive estimator is efficient and uses only the SGD iterates: upon receiving new observations, we update the confidence intervals at the same time as the ASGD solutions, without extra computational or memory cost. This approach fits the online setting even when the total number of observations is unknown, and it retains the full advantage of SGD: computational and memory efficiency. This work is joint with Wanrong Zhu and Xi Chen.
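To illustrate the flavour of the approach, here is a minimal sketch (not the speakers' recursive estimator) of averaged SGD on a least-squares model, with a naive online plug-in estimate of the ASGD asymptotic covariance, A⁻¹SA⁻¹/n, built from running moment estimates. The model, step-size schedule, and estimator below are illustrative assumptions only.

```python
import numpy as np

# Averaged SGD for least-squares regression with streaming data,
# plus a naive online plug-in covariance estimate for the average.
rng = np.random.default_rng(0)
d, n = 3, 20000
theta_star = np.array([1.0, -2.0, 0.5])  # true parameter (for simulation)

x = np.zeros(d)          # current SGD iterate
x_bar = np.zeros(d)      # running average (ASGD solution)
A = np.zeros((d, d))     # running estimate of E[a a^T] (Hessian of the loss)
S = np.zeros((d, d))     # running estimate of the gradient covariance

for t in range(1, n + 1):
    a = rng.standard_normal(d)
    y = a @ theta_star + rng.standard_normal()
    g = (x @ a - y) * a               # gradient of 0.5 * (a^T x - y)^2
    eta = 0.1 * t ** -0.505           # step size eta_t = c * t^(-alpha), alpha in (1/2, 1)
    x -= eta * g
    x_bar += (x - x_bar) / t          # online update of the average
    A += (np.outer(a, a) - A) / t     # online second-moment updates
    S += (np.outer(g, g) - S) / t

Ainv = np.linalg.inv(A)
cov = Ainv @ S @ Ainv / n             # plug-in asymptotic covariance of x_bar
half = 1.96 * np.sqrt(np.diag(cov))   # coordinate-wise 95% half-widths
lo, hi = x_bar - half, x_bar + half
```

Like the estimator described in the abstract, each new observation updates both the solution and the interval estimate in O(d²) time with O(d²) memory, independent of the stream length; the talk's recursive estimator achieves this using only the SGD iterates themselves.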
For more information, please contact email@example.com
Join the Heilbronn Event mailing list to keep up to date with our upcoming events.