Aims
To introduce students to advanced stochastic simulation methods, such as Markov chain Monte Carlo, in a Bayesian context; to illustrate the practical issues that arise in applying such methods, using real-data examples; and to discuss Bayesian approaches to model selection, model criticism and model mixing.
Intended Learning Outcomes
At the end of this course, a student should be able to:
Illustrate the use of Monte Carlo methods, including importance sampling;
Explain the operation and basic theory of the two main Markov chain Monte Carlo methods, Gibbs sampling and the Metropolis-Hastings algorithm;
Derive the full conditional distributions for parameters in simple low-dimensional problems;
Implement Gibbs samplers and Metropolis-Hastings chains in R;
Apply diagnostic procedures to check convergence and mixing of MCMC methods;
Describe Bayesian approaches to model selection;
Calculate Bayes factors for simple model comparisons;
Explain MCMC approaches to model selection and model mixing;
Describe posterior predictive checks as a means of model criticism.
Tentative Syllabus
Review of basic Monte Carlo methods
Importance sampling and other variance-reduction techniques; rejection sampling
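The importance-sampling idea above can be sketched in a few lines: draws from a proposal density q are reweighted by p/q to estimate expectations under the target p. The sketch below is in Python for illustration (the course itself uses R), and the particular target, proposal and sample size are illustrative choices, not part of the syllabus.

```python
import math
import random

random.seed(0)

# Target p = N(0, 1); proposal q = N(0, 2^2). Both log densities up to constants.
def log_p(x):
    return -0.5 * x * x

def log_q(x):
    return -0.5 * (x / 2.0) ** 2

n = 100_000
xs = [random.gauss(0.0, 2.0) for _ in range(n)]   # draws from the proposal q
ws = [math.exp(log_p(x) - log_q(x)) for x in xs]  # importance weights p/q

# Self-normalised importance-sampling estimate of E_p[X^2] (true value: 1)
est = sum(w * x * x for w, x in zip(ws, xs)) / sum(ws)
print(est)
```

Because the estimate is self-normalised, the densities only need to be known up to constants, which is the situation that arises with Bayesian posteriors.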
Markov chain Monte Carlo methods
Gibbs sampling, including derivation of full conditionals and choice of blocking
Metropolis-Hastings algorithm, design issues and special cases
More advanced techniques (e.g. the slice sampler and population MC), as time permits
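As a concrete illustration of the Gibbs step listed above, consider a standard bivariate normal with correlation rho, where each full conditional is itself a normal distribution. This is a Python sketch for illustration (course implementations are in R), and the choice of target is an assumption made for the example.

```python
import math
import random

random.seed(1)

rho = 0.8
sd = math.sqrt(1.0 - rho * rho)  # conditional standard deviation

# Full conditionals of the standard bivariate normal with correlation rho:
#   x | y ~ N(rho * y, 1 - rho^2), and symmetrically for y | x.
x, y = 0.0, 0.0
xs, ys = [], []
for i in range(50_000):
    x = random.gauss(rho * y, sd)  # update x from its full conditional
    y = random.gauss(rho * x, sd)  # update y from its full conditional
    xs.append(x)
    ys.append(y)

keep_x, keep_y = xs[5_000:], ys[5_000:]  # discard burn-in
mean_x = sum(keep_x) / len(keep_x)
# Sample moment E[XY], which should be close to rho for this target
corr_num = sum(a * b for a, b in zip(keep_x, keep_y)) / len(keep_x)
print(mean_x, corr_num)
```

Alternating the two conditional draws is exactly the two-block Gibbs scheme; the choice of blocking matters more in higher-dimensional problems, where grouping strongly correlated parameters improves mixing.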
Practical issues in MCMC
Design, convergence, mixing, estimating the variance of MCMC estimates
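One standard route to estimating the variance of an MCMC estimate is the batch-means method, which this Python sketch applies to a random-walk Metropolis chain (the target, proposal scale and batch count here are illustrative assumptions, not prescribed by the course):

```python
import math
import random

random.seed(2)

def log_target(x):  # standard normal target, up to a constant
    return -0.5 * x * x

# Random-walk Metropolis chain with a symmetric Gaussian proposal
x = 0.0
chain = []
for i in range(64_000):
    prop = x + random.gauss(0.0, 1.0)
    # Symmetric proposal, so the Hastings ratio reduces to the target ratio
    if math.log(random.random()) < log_target(prop) - log_target(x):
        x = prop  # accept
    chain.append(x)

chain = chain[14_000:]  # 50,000 post-burn-in draws
mean = sum(chain) / len(chain)

# Batch means: split the chain into b batches and estimate the variance of
# the MCMC mean from the spread of the batch averages.
b = 50
m = len(chain) // b
batch_means = [sum(chain[i * m:(i + 1) * m]) / m for i in range(b)]
var_of_mean = sum((bm - mean) ** 2 for bm in batch_means) / (b - 1) / b
print(mean, math.sqrt(var_of_mean))
```

The batch length must be long relative to the chain's autocorrelation time for the batch averages to be roughly independent, which is why naive i.i.d. variance formulas understate the true Monte Carlo error.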
Marginal likelihood
Definition, approximate computation, MCMC methods, including reversible jump MCMC
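The simplest approximate computation of a marginal likelihood averages the likelihood over draws from the prior. A Python sketch on an illustrative conjugate example, chosen because the answer is known in closed form (under a uniform prior, p(y) = 1/(n+1) for binomial data):

```python
import math
import random

random.seed(3)

# Beta(1,1) (uniform) prior on theta; data: y successes in n Bernoulli trials.
n, y = 20, 14
binom = math.comb(n, y)

def likelihood(theta):
    return binom * theta ** y * (1.0 - theta) ** (n - y)

# Monte Carlo estimate of the marginal likelihood p(y) = E_prior[p(y | theta)]
draws = 200_000
est = sum(likelihood(random.random()) for _ in range(draws)) / draws

exact = 1.0 / (n + 1)  # closed form under the uniform prior
print(est, exact)
```

Prior sampling is inefficient when the posterior is much more concentrated than the prior, which motivates the more sophisticated MCMC-based methods listed above.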
Model selection
Bayes factors, posterior odds, BIC, DIC
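Bayes factors compare models through the ratio of their marginal likelihoods, and posterior odds are the Bayes factor times the prior odds. A minimal worked Python example (the point-null-versus-uniform comparison is an illustrative choice) in which both marginal likelihoods are available in closed form:

```python
import math

# Bayes factor for y successes in n trials:
#   M1: theta = 0.5 exactly;  M2: theta ~ Uniform(0, 1).
n, y = 20, 14
m1 = math.comb(n, y) * 0.5 ** n  # p(y | M1)
m2 = 1.0 / (n + 1)               # p(y | M2), closed form under the uniform prior
bf = m1 / m2                     # Bayes factor B12

# Posterior odds = Bayes factor * prior odds (here prior odds = 1)
print(bf)
```

A value below 1 favours the vaguer model M2, though only mildly here; interpreting the magnitude on a standard evidence scale is part of the model-selection material.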
Model criticism
Posterior predictive checks, other methods as time permits
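The mechanics of a posterior predictive check can be sketched briefly: draw parameters from the posterior, simulate replicate data sets, and compare an observed test statistic with its distribution over the replicates. The conjugate model and the test statistic below are illustrative assumptions (in practice one chooses a statistic that probes a suspected model failure); the sketch is in Python, though the course uses R.

```python
import random

random.seed(4)

# Observed data: y_obs successes in n Bernoulli trials; uniform Beta(1,1)
# prior, so the posterior for theta is Beta(y_obs + 1, n - y_obs + 1).
n, y_obs = 20, 14

def draw_y_rep():
    theta = random.betavariate(y_obs + 1, n - y_obs + 1)   # posterior draw
    return sum(random.random() < theta for _ in range(n))  # replicate data set

# Posterior predictive p-value for the statistic T(y) = number of successes
reps = 20_000
pval = sum(draw_y_rep() >= y_obs for _ in range(reps)) / reps
print(pval)
```

A p-value near 0 or 1 signals that the model has trouble reproducing that aspect of the data; values in the middle, as here, indicate no evidence of misfit for the chosen statistic.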
One or two case studies
Assessment
Honours: 100% exam
Masters: 85% exam, 15% for a summary of read paper(s)