Please use this identifier to cite or link to this item: http://theses.ncl.ac.uk/jspui/handle/10443/3700
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Newman, Keith
dc.date.accessioned: 2017-11-21T15:20:31Z
dc.date.available: 2017-11-21T15:20:31Z
dc.date.issued: 2017
dc.identifier.uri: http://hdl.handle.net/10443/3700
dc.description: PhD Thesis [en_US]
dc.description.abstract: Latent Gaussian models are popular and versatile models for performing Bayesian inference. In many cases these models are analytically intractable, creating a need for alternative inference methods. Integrated nested Laplace approximations (INLA) provide fast, deterministic inference of approximate posterior densities by exploiting sparsity in the latent structure of the model. Markov chain Monte Carlo (MCMC) is often used for Bayesian inference by sampling from a target posterior distribution; it suffers from poor mixing when many variables are correlated, but careful reparameterisation or the use of blocking methods can mitigate these issues. Blocking comes with additional computational overhead due to the matrix algebra involved; these costs can be limited by harnessing the same latent Markov structures and sparse precision matrix properties utilised by INLA, with particular attention paid to efficient matrix operations. We discuss how inference for linear and latent Gaussian models can be constructed by combining methods for linear Gaussian models with Gaussian approximations. We then apply these ideas to a case study in detecting genetic epistasis between telomere defects and deletions of non-essential genes in Saccharomyces cerevisiae, for an experiment known as Quantitative Fitness Analysis (QFA). Bayesian variable selection is included to identify which gene deletions cause a genetic interaction. Previous Bayesian models have proven successful in detecting interactions, but are time-consuming due to the complexity of the model and poor mixing. Linear and latent Gaussian models are created to pursue more efficient inference than standard Gibbs samplers, but we find that inference methods for latent Gaussian models can struggle as the dimension increases. We also investigate how the introduction of variable selection provides opportunities to reduce the dimension of the latent model structure for potentially faster inference. Finally, we discuss progress on a new follow-on experiment, Mini QFA, which attempts to find epistasis between telomere defects and a pair of gene deletions. [en_US]
dc.language.iso: es [en_US]
dc.publisher: Newcastle University [en_US]
dc.title: Bayesian modelling of latent Gaussian models featuring variable selection [en_US]
dc.type: Thesis [en_US]
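
The abstract above notes that the cost of blocked updates can be limited by exploiting the latent Markov structure and sparse precision matrices of these models. As a rough illustration only (not part of this record or of the thesis itself), the Python sketch below samples a latent field whose precision matrix is banded, using a banded Cholesky factorisation so the work grows linearly in the field length; the field size n, the precision tau, and the jitter value are arbitrary choices for the example.

```python
# Illustrative sketch: sampling x ~ N(0, Q^{-1}) for a Gaussian Markov random
# field whose precision matrix Q is banded, so the Cholesky and solve steps
# cost O(n) rather than O(n^3).  All parameter values here are arbitrary.
import numpy as np
from scipy.linalg import cholesky_banded, solve_banded

n = 1000    # length of the latent field (arbitrary)
tau = 2.0   # precision of the first-order random-walk increments (arbitrary)

# First-order random-walk precision matrix in upper banded storage:
# row 0 holds the superdiagonal, row 1 the main diagonal.
Q_banded = np.zeros((2, n))
Q_banded[1, :] = 2.0 * tau
Q_banded[1, 0] = Q_banded[1, -1] = tau
Q_banded[0, 1:] = -tau
Q_banded[1, :] += 1e-8   # small jitter: the RW1 precision is rank-deficient

# Banded Cholesky factor U with Q = U^T U, computed without forming Q densely.
U_banded = cholesky_banded(Q_banded, lower=False)

# Draw z ~ N(0, I) and solve the triangular banded system U x = z, giving a
# sample x whose precision matrix is Q.
rng = np.random.default_rng(1)
z = rng.standard_normal(n)
x = solve_banded((0, 1), U_banded, z)
print(x[:5])
```

The same pattern applies with a general sparse Cholesky factorisation when the precision matrix is sparse but not banded.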
Appears in Collections:School of Mathematics and Statistics

Files in This Item:
File | Description | Size | Format
Newman, K 2017.pdf | Thesis | 12.28 MB | Adobe PDF
dspacelicence.pdf | Licence | 43.82 kB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.