Bayesian Statistics (III/IV) (0530001) 10 credits
Dr Peter Lee (no office), tel 01904 654200, e-mail pml1@york.ac.uk
This is the version for the academic year beginning 1 September 2010.
Aims: To introduce the basic notions of Bayesian statistics, showing
how Bayes' Theorem provides a natural way of combining prior information with
experimental data to arrive at a posterior probability distribution over
parameters, and to show the differences between classical (sampling theory)
statistics and Bayesian statistics.
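For reference, the central identity the module is built around can be written as follows (generic notation for a parameter \theta and data x; this sketch is ours and is not quoted from the module notes):

    p(\theta \mid x) = \frac{p(x \mid \theta)\, p(\theta)}{\int p(x \mid \theta)\, p(\theta)\, d\theta} \propto p(x \mid \theta)\, p(\theta),

i.e. the posterior distribution is proportional to the likelihood times the prior, the denominator acting only as a normalising constant.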
Learning objectives: By the end of the module the student should be
able:
- To prove and use Bayes' Theorem in its various forms;
- To carry out an analysis of normally distributed data with a normal prior
distribution (the posterior update is sketched after this list);
- To carry out a significance test of a point (or sharp) null
hypothesis using a full Bayesian methodology.
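As a sketch of the second objective (in generic notation, not necessarily that of the recommended text): if x_1, \dots, x_n are independent N(\theta, \sigma^2) observations with \sigma^2 known, and the prior is \theta \sim N(\theta_0, \phi_0), then the posterior is \theta \mid x \sim N(\theta_1, \phi_1) with

    \phi_1 = \left( \frac{1}{\phi_0} + \frac{n}{\sigma^2} \right)^{-1}, \qquad
    \theta_1 = \phi_1 \left( \frac{\theta_0}{\phi_0} + \frac{n \bar{x}}{\sigma^2} \right),

so precisions (reciprocal variances) add, and the posterior mean is the precision-weighted average of the prior mean and the sample mean.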
Prerequisites: Statistical Theory I (0590015).
Syllabus: The module will closely follow parts of Chapters 1 to 4
of the main recommended textbook listed below.
- Preliminaries
  - Probability and Bayes' Theorem
  - Examples on Bayes' Theorem
  - Random variables
  - Several random variables
  - Means and variances (conditional means and variances omitted)
  - Exercises on Chapter 1
- Bayesian Inference for the Normal Distribution
  - Nature of Bayesian inference
  - Normal prior and likelihood
  - Several normal observations with a normal prior
  - Dominant likelihoods
  - Locally uniform priors
  - Highest density regions (HDRs)
  - Normal variance
  - HDRs for the normal variance
  - The role of sufficiency
  - Conjugate prior distributions (mixtures of conjugate densities omitted)
  - The exponential family (omitted)
  - Normal mean and variance both unknown
  - Conjugate joint prior for the normal distribution
  - Exercises on Chapter 2
- Some Other Common Distributions
  - The binomial distribution
  - Reference prior for the binomial likelihood
  - Jeffreys' rule (multidimensional case omitted)
  - The Poisson distribution
  - The uniform distribution (omitted)
  - Reference prior for the uniform distribution (omitted)
  - The tramcar problem (omitted)
  - The first digit problem; invariant priors (omitted)
  - The circular normal distribution (omitted)
  - Approximations based on the likelihood (extension to more than one parameter omitted)
  - Reference posterior distributions (omitted)
  - Exercises on Chapter 3
- Hypothesis Testing
  - Hypothesis testing
  - One-sided hypothesis tests
  - Lindley's method
  - Point null hypotheses with prior information (case of nearly constant likelihood omitted)
  - Point null hypotheses (normal case; case of an unknown variance omitted)
  - The Doogian philosophy (omitted)
  - Exercises on Chapter 4
- Two sample problems (omitted)
- Correlation, regression and analysis of variance (omitted)
- Other topics (omitted)
- Hierarchical models (omitted)
- The Gibbs sampler and other numerical methods (non-examinable)
  - Introduction to Numerical Methods
  - The EM Algorithm (omitted apart from the subsection below)
  - Data augmentation by Monte Carlo
  - The Gibbs sampler (a minimal R sketch follows this outline)
  - Rejection sampling (omitted)
  - The Metropolis-Hastings algorithm (omitted)
  - Introduction to WinBUGS (omitted)
  - Generalized linear models (omitted)
  - Exercises on Chapter 9
Appendices
  - Common statistical distributions
  - Tables
  - R programs
  - Further reading
References
Index
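As flagged against "The Gibbs sampler" in the outline above, the following is a minimal illustrative sketch in R (the language of the textbook's program appendix) of a Gibbs sampler for normal data with both mean and variance unknown, under semi-conjugate priors; the priors, data and run length are invented for illustration and are not taken from the module.

  # Gibbs sampler sketch: y_i ~ N(mu, sigma^2) with mu ~ N(mu0, tau0sq)
  # and precision 1/sigma^2 ~ Gamma(a0, rate = b0); all values illustrative.
  set.seed(1)
  y <- rnorm(30, mean = 5, sd = 2)      # simulated data standing in for a real data set
  n <- length(y); ybar <- mean(y)
  mu0 <- 0; tau0sq <- 100               # prior for the mean
  a0 <- 1; b0 <- 1                      # prior for the precision
  n_iter <- 5000
  mu <- numeric(n_iter); sig2 <- numeric(n_iter)
  mu[1] <- ybar; sig2[1] <- var(y)
  for (t in 2:n_iter) {
    # full conditional of mu given sigma^2 is normal
    prec  <- 1 / tau0sq + n / sig2[t - 1]
    m     <- (mu0 / tau0sq + n * ybar / sig2[t - 1]) / prec
    mu[t] <- rnorm(1, mean = m, sd = sqrt(1 / prec))
    # full conditional of the precision given mu is gamma
    sig2[t] <- 1 / rgamma(1, shape = a0 + n / 2,
                          rate  = b0 + sum((y - mu[t])^2) / 2)
  }
  burn <- 1:1000
  c(mean(mu[-burn]), mean(sig2[-burn]))  # crude posterior means after burn-in

Alternating draws from these two full conditionals produces a Markov chain whose long-run distribution is the joint posterior of the mean and variance.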
Recommended texts:
*** P M Lee, Bayesian Statistics: An Introduction, Wiley (SF 4 LEE).
(For most of the material covered, either the second, the third
or the fourth edition will be satisfactory.)
*** A Gelman, J B Carlin, H S Stern and D B Rubin, Bayesian Data
Analysis (2nd edn), Chapman & Hall (SF GEL).
*** H R Neave, Statistics Tables for Mathematicians, Engineers,
Economists and the Behavioural and Management Sciences, London: Routledge
(SF 0.83 NEA and REF SF 0.83 NEA).
* G R Iverson, Bayesian Statistical Inference, Beverly Hills, CA: Sage
(SF 4 IVE) (for preliminary reading).
* S B McGrayne, The Theory That Would Not Die, New Haven, CT: Yale
University Press, 2011 (for very interesting background reading) (SF 4 MCG).
* J Albert, Bayesian Computation with R, Springer-Verlag (SF 4 ALB)
(for computational aspects).
** D V Lindley, An Introduction to Probability and Statistics from a
Bayesian Viewpoint (2 vols - Part I: Probability and Part II: Inference),
Cambridge University Press (S 9 LIN).
Teaching and Support Teaching:
Spring Term
2 lectures per week
1 problems class per week
Assessment:
One and a half hour closed examination towards the end of the Summer Term: 90%
Coursework: 10%
Elective Information: A course on Bayesian statistics showing how to
make inferences by combining prior beliefs with information obtained from
experimental data.