Data Science Seminar: A Generalization Bound for Online Variational Inference
Posted 11 Oct 2019 in Sponsored events
5 December 2019, 12.00 PM – 1.00 PM
Room G.09, Fry Building, Woodland Road, University of Bristol, BS8 1UG
A talk by Pierre Alquier, Research Scientist, Riken AIP, Tokyo, Japan
Bayesian inference provides an attractive online-learning framework for analysing sequential data, and it offers generalization guarantees that hold even under model mismatch and in the presence of adversaries. Unfortunately, exact Bayesian inference is rarely feasible in practice and approximation methods are usually employed. Do such methods preserve the generalization properties of Bayesian inference?
In this talk, we show that this is indeed the case for some variational inference (VI) algorithms. We propose new online, tempered VI algorithms and derive their generalization bounds. Our theoretical result relies on the convexity of the variational objective, but we argue that it should hold more generally, and we present empirical evidence in support of this. This work provides theoretical justification for online algorithms that rely on approximate Bayesian methods.
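To make the idea of an online, tempered VI update concrete, here is a minimal sketch in a deliberately simple setting: estimating the mean of a Gaussian with known noise variance from a data stream. At each step the new variational posterior minimises a tempered negative log-likelihood term plus a KL penalty to the previous posterior; in this conjugate Gaussian case the minimiser has a closed form. The function name, the conjugate setting, and the tempering value `eta` are illustrative assumptions, not the algorithms analysed in the talk.

```python
# Sketch of one online tempered VI step in a conjugate Gaussian model.
# Assumed setting (not from the talk): x_t ~ N(theta, sigma2), variational
# family q = N(m, s2), tempering parameter eta in (0, 1].
import numpy as np

def tempered_vi_step(m, s2, x, sigma2=1.0, eta=0.5):
    """One update: minimise E_q[-eta * log p(x|theta)] + KL(q || q_prev).

    Because the model is conjugate, the minimiser is Gaussian with a
    closed-form precision and mean (so no optimisation loop is needed).
    """
    prec = 1.0 / s2 + eta / sigma2          # new posterior precision
    s2_new = 1.0 / prec
    m_new = s2_new * (m / s2 + eta * x / sigma2)
    return m_new, s2_new

rng = np.random.default_rng(0)
m, s2 = 0.0, 10.0                            # broad initial belief
for x in rng.normal(2.0, 1.0, size=500):     # stream with true mean 2.0
    m, s2 = tempered_vi_step(m, s2, x)

print(m, s2)  # posterior mean concentrates near the true mean
```

Smaller `eta` down-weights each likelihood term relative to the KL penalty, which slows concentration but is exactly the kind of tempering that underpins the generalization bounds discussed in the talk.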
***Please note the amended date, time and location for this seminar***
Bristol Data Science Seminars
The Jean Golding Institute has teamed up with the Heilbronn Institute for Mathematical Research to showcase the latest research in Data Science – methodology with roots in Mathematics and Computer Science with important applied implications.
The series will feature a range of internationally regarded high-profile speakers on topics that will be relevant to a broad audience.
Please take a look at Bristol Data Science Seminars for more information about the series.