Bristol Data Science Seminar
08 Oct 2019, Sponsored events
23 October 2019, 1pm – 3.30pm
G.09, Fry Building, University of Bristol, Bristol, BS8 1TH
‘Using bagged posteriors for robust model-based inference’ – a two-part talk by Jonathan Huggins, Harvard University.
Introducing a new seminar series in Data Science for 2019-2020.
The Jean Golding Institute has teamed up with the Heilbronn Institute for Mathematical Research to showcase the latest research in Data Science – methodology with roots in Mathematics and Computer Science with important applied implications.
To kick off the first event of the Bristol Data Science Seminars, Jonathan Huggins (Department of Biostatistics, Harvard University) will be delivering a two-part talk on ‘Using bagged posteriors for robust model-based inference.’
Coffee and cake will be available during the break.
1pm – 2pm: Broad interest Data Science showcase – Jonathan Huggins, ‘Using bagged posteriors for robust model-based inference’
2pm – 2.30pm: Coffee and cake in the Common Room
2.30pm – 3.30pm: Statistics seminar – Jonathan Huggins, ‘Using bagged posteriors for robust inference and model criticism’
Broad interest Data Science showcase: ‘Using bagged posteriors for robust model-based inference’
Standard Bayesian inference is known to be sensitive to misspecification between the model and the data-generating mechanism, leading to unreliable uncertainty quantification and poor predictive performance. However, finding generally applicable and computationally feasible methods for robust Bayesian inference under misspecification has proven to be a difficult challenge. An intriguing approach is to use bagging on the Bayesian posterior (“BayesBag”); that is, to use the average of posterior distributions conditioned on bootstrapped datasets.
In this talk, I describe the statistical behavior of BayesBag, propose a model-data mismatch index for diagnosing model misspecification using BayesBag, and empirically validate our BayesBag methodology on synthetic and real-world data. We find that in the presence of significant misspecification, BayesBag yields more reproducible inferences, has better predictive accuracy, and selects correct models more often than the standard Bayesian posterior; meanwhile, when the model is correctly specified, BayesBag produces superior or equally good results for parameter inference and prediction, while being slightly more conservative for model selection. Overall, our results demonstrate that BayesBag combines the attractive modeling features of standard Bayesian inference with the distributional robustness properties of frequentist methods.
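The core idea of BayesBag described above, averaging posterior distributions conditioned on bootstrapped copies of the data, can be sketched in a few lines. The following is a minimal illustration, not the speaker's implementation: it assumes a toy conjugate Normal model with a deliberately misspecified (heavy-tailed) data-generating process, and approximates the bagged posterior by pooling equal-sized draws from each bootstrap posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the model assumes x_i ~ N(theta, 1), but the data are actually
# heavy-tailed (Student-t) -- a simple form of model misspecification.
data = rng.standard_t(df=3, size=200)

def conjugate_posterior(x, prior_mean=0.0, prior_var=100.0, noise_var=1.0):
    """Exact posterior for theta under x_i ~ N(theta, noise_var),
    with prior theta ~ N(prior_mean, prior_var)."""
    n = len(x)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + x.sum() / noise_var)
    return post_mean, post_var

def bayesbag(x, n_boot=50, n_draws=1000):
    """Approximate the BayesBag posterior: the average of the posteriors
    obtained from bootstrap resamples of the data, represented here by
    pooling an equal number of draws from each bootstrap posterior."""
    draws = []
    for _ in range(n_boot):
        xb = rng.choice(x, size=len(x), replace=True)  # bootstrap resample
        m, v = conjugate_posterior(xb)
        draws.append(rng.normal(m, np.sqrt(v), size=n_draws))
    return np.concatenate(draws)  # mixture over bootstrap posteriors

standard_mean, standard_var = conjugate_posterior(data)
bagged = bayesbag(data)
print(f"standard posterior sd: {np.sqrt(standard_var):.3f}")
print(f"BayesBag posterior sd: {bagged.std():.3f}")
```

Under misspecification the bagged posterior is typically wider than the standard one, reflecting the extra sampling variability the bootstrap exposes; this is the mechanism behind the more honest uncertainty quantification described in the abstract.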
Statistics seminar: ‘Using bagged posteriors for robust inference and model criticism’
In the second talk of the seminar, Jonathan Huggins will present a deeper dive into the asymptotic theory of BayesBag, looking in more detail at model criticism. This talk will be a more technical exploration of the concepts outlined in the general talk before the break.