Data Science Seminar: Using Bagged Posteriors for Robust Model-based Inference
23 October 2019
Fry Building, University of Bristol, Bristol, UK
By: Jonathan Huggins
Department of Biostatistics, Harvard University
The Jean Golding Institute has teamed up with the Heilbronn Institute for Mathematical Research to showcase the latest research in Data Science – methodology with roots in Mathematics and Computer Science with important applied implications.
13:00: Part 1 – Broad interest Data Science showcase (Using bagged posteriors for robust model-based inference)
14:00: Coffee and cake in the Common Room
14:30: Part 2 – Statistics seminar (Using bagged posteriors for robust inference and model criticism)
Abstract 1: Using bagged posteriors for robust model-based inference
Standard Bayesian inference is known to be sensitive to misspecification between the model and the data-generating mechanism, leading to unreliable uncertainty quantification and poor predictive performance. However, finding generally applicable and computationally feasible methods for robust Bayesian inference under misspecification has proven to be a difficult challenge. An intriguing approach is to use bagging on the Bayesian posterior (“BayesBag”); that is, to use the average of posterior distributions conditioned on bootstrapped datasets. In this talk, I describe the statistical behavior of BayesBag, propose a model-data mismatch index for diagnosing model misspecification using BayesBag, and empirically validate our BayesBag methodology on synthetic and real-world data. We find that in the presence of significant misspecification, BayesBag yields more reproducible inferences, has better predictive accuracy, and selects correct models more often than the standard Bayesian posterior; meanwhile, when the model is correctly specified, BayesBag produces superior or equally good results for parameter inference and prediction, while being slightly more conservative for model selection. Overall, our results demonstrate that BayesBag combines the attractive modeling features of standard Bayesian inference with the distributional robustness properties of frequentist methods.
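The core BayesBag construction described above — averaging posteriors conditioned on bootstrapped copies of the data — can be sketched in a few lines. The following is a minimal illustration, not the speaker's implementation: it uses a conjugate Normal-mean model (known noise variance) so each bootstrap posterior has a closed form, and represents the bagged posterior as an equal-weight mixture of the bootstrap posteriors. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bayesbag_posterior_samples(data, n_boot=50, n_samples=200,
                               prior_mean=0.0, prior_var=100.0, noise_var=1.0):
    """Sketch of BayesBag: average posteriors over bootstrapped datasets.

    Illustrative conjugate Normal-mean model with known noise variance,
    so each bootstrap posterior is Normal in closed form. The bagged
    posterior is the equal-weight mixture of the bootstrap posteriors,
    represented here by pooling samples from each.
    """
    data = np.asarray(data)
    n = len(data)
    samples = []
    for _ in range(n_boot):
        # Resample the data with replacement (one bootstrap dataset)
        boot = rng.choice(data, size=n, replace=True)
        # Conjugate update: Normal prior times Normal likelihood
        post_var = 1.0 / (1.0 / prior_var + n / noise_var)
        post_mean = post_var * (prior_mean / prior_var + boot.sum() / noise_var)
        samples.append(rng.normal(post_mean, np.sqrt(post_var), size=n_samples))
    # Pooled samples approximate the mixture (bagged) posterior
    return np.concatenate(samples)

# Usage: data generated from a Normal(2, 1) mechanism
data = rng.normal(2.0, 1.0, size=100)
bagged = bayesbag_posterior_samples(data)
```

Under misspecification, the spread of the bootstrap posteriors widens the bagged posterior, which is the source of the improved uncertainty quantification discussed in the talk.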
Abstract 2: Using bagged posteriors for robust inference and model criticism
In the second talk of the seminar, Jonathan Huggins will present a deeper dive into the asymptotic theory of BayesBag, looking in more detail at model criticism. This talk will be a more technical exploration of the concepts outlined in the general talk before the break.