Please use this identifier to cite or link to this item: http://theses.ncl.ac.uk/jspui/handle/10443/6115
Title: Bridging the Gap Between Modelling and Computation in Bayesian Statistics
Authors: Matsubara, Takuo
Issue Date: 2023
Publisher: Newcastle University
Abstract: Models that involve intractable normalising constants pose a major computational challenge to statistical inference, since evaluating such constants requires numerical integration of complex functions over large or possibly infinite sets, which is often impractical. In particular, Bayesian inference for intractable models demands specially tailored algorithms that bypass the evaluation of two nested intractable normalising constants, one arising from the posterior and one from the model. This thesis addresses this computational challenge by developing a novel generalised Bayesian inference approach built on a Stein discrepancy, called SD-Bayes. Generalised Bayesian inference updates prior beliefs using a loss function rather than a likelihood, and can therefore confer desirable properties, such as robustness to model misspecification, on the resulting generalised posteriors. In this context, selecting a Stein discrepancy as the loss function circumvents evaluation of the model's normalising constant and produces generalised posteriors that can be sampled with standard Markov chain Monte Carlo algorithms. On a theoretical level, we establish posterior consistency, asymptotic normality, and global bias-robustness of the generalised posteriors. Generalised posteriors equipped with global bias-robustness are shown to be strongly insensitive to irrelevant outliers mixed into the data, a simple yet common form of model misspecification. For intractable models on continuous domains, we derive a useful special case of the Stein discrepancy, the kernel Stein discrepancy, to be combined with SD-Bayes. The resulting SD-Bayes exhibits strong global bias-robustness and enables fully conjugate inference for exponential family models. We provide numerical experiments on a range of intractable distributions, including applications to kernel-based exponential family models and non-Gaussian graphical models. For intractable models on discrete domains, we establish another useful special case of the Stein discrepancy, the discrete Fisher divergence, to be combined with SD-Bayes. The resulting SD-Bayes benefits from low computational cost and from the absence of user-specified hyperparameters, which can be difficult to choose in the discrete setting. In addition, a new approach to calibrating generalised posteriors through optimisation is considered, independently of SD-Bayes. Applications are presented to lattice models for discrete spatial data and to multivariate models for count data, where in each case the methodology enables generalised Bayesian inference at low computational cost.
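As context for the abstract, the following is a minimal sketch of the loss-based posterior update and of the kernel Stein discrepancy it mentions, written in standard notation; the learning rate \(\beta\), the kernel \(k\), and the exact scaling of the loss with the sample size \(n\) are illustrative choices and may differ from those used in the thesis.

\[
\pi_n(\theta \mid x_{1:n}) \;\propto\; \pi(\theta)\, \exp\!\bigl\{-\beta\, n\, \mathrm{KSD}^2(\mathbb{P}_n \,\|\, P_\theta)\bigr\},
\]
where \(\mathbb{P}_n\) denotes the empirical distribution of the data \(x_1, \dots, x_n\) and, for a reproducing kernel \(k\),
\[
\mathrm{KSD}^2(\mathbb{P}_n \,\|\, P_\theta) = \frac{1}{n^2} \sum_{i=1}^{n} \sum_{j=1}^{n} k_{p_\theta}(x_i, x_j),
\]
with the Stein kernel
\[
k_{p_\theta}(x, x') = \nabla_x \cdot \nabla_{x'} k(x, x')
+ \nabla_x k(x, x') \cdot \nabla_{x'} \log p_\theta(x')
+ \nabla_{x'} k(x, x') \cdot \nabla_x \log p_\theta(x)
+ k(x, x')\, \nabla_x \log p_\theta(x) \cdot \nabla_{x'} \log p_\theta(x').
\]
Because the Stein kernel depends on the model only through the score \(\nabla \log p_\theta\), the intractable normalising constant cancels, which is what makes the generalised posterior accessible to standard MCMC; on discrete domains the discrete Fisher divergence plays the analogous role, since it is built from ratios of the form \(p_\theta(\tilde{x})/p_\theta(x)\) in which the constant likewise cancels.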
Description: Ph.D. Thesis.
URI: http://hdl.handle.net/10443/6115
Appears in Collections:School of Mathematics, Statistics and Physics

Files in This Item:
File                                  Description  Size      Format
dspacelicence.pdf                     Licence      43.82 kB  Adobe PDF
Matsubara Takuo 190341374 ecopy.pdf   Thesis       4.71 MB   Adobe PDF

