- Department: Mathematics
- Module co-ordinator: Information currently unavailable
- Credit value: 20 credits
- Credit level: I
- Academic year of delivery: 2022-23
This module is the first part of the Probability & Statistics stream, and as such must be taken with the second part (Statistical Inference and Linear Models).
Pre-requisite modules:
Post-requisite modules:
| Occurrence | Teaching period |
|---|---|
| A | Autumn Term 2022-23 |
This module will give students a theoretical and mathematically formal framework for understanding the foundations of data science. Students will learn how to work with multiple random variables in a variety of settings: joint and conditional distributions will be developed, along with estimators and convergence theorems, and Markov chains will be introduced to deal with random variables indexed by discrete time. Further familiarity with the statistical software R will be developed throughout.
- Understand the concepts of joint and conditional distributions, and be able to compute conditional expectations.
- Be able to estimate parameters of standard distributions following the maximum likelihood approach.
- Understand estimators as functions of random variables and be able to assess their properties, such as unbiasedness, consistency and asymptotic normality.
- Understand the role and use of generating functions, be able to use them to compute the expectation and variance of standard distributions, and be able to identify distributions, such as that of a sum of independent random variables.
- Understand the Weak Law of Large Numbers and the Central Limit Theorem.
- Understand the limits of estimation via the Cramér-Rao lower bound.
- Describe and calculate with discrete time/space Markov chains, including the calculation of absorption probabilities and stationary distributions.
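To illustrate the Markov chain outcome above, the stationary distribution of a small discrete-time chain can be found by repeatedly applying the transition matrix. This is a minimal sketch in Python (the module itself uses R) with a hypothetical two-state transition matrix chosen for illustration:

```python
# Hypothetical two-state Markov chain; P[i][j] is the probability
# of moving from state i to state j in one step.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(dist, P):
    """One step of the chain: multiply the row vector dist by P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Iterate from an arbitrary starting distribution; for an irreducible,
# aperiodic chain this converges to the stationary distribution pi,
# which satisfies pi = pi * P.
dist = [1.0, 0.0]
for _ in range(1000):
    dist = step(dist, P)

print(dist)  # approximately the stationary distribution (5/6, 1/6)
```

The same stationary distribution can be obtained exactly by solving the linear system pi = pi P together with pi summing to 1, which is how it would typically be done by hand in the module.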
- Joint and conditional distributions (covering continuous distributions, in particular the Multivariate Normal)
- Laws of total expectation and variance
- Maximum likelihood estimation
- Further properties of estimators (e.g., precision measures such as MSE, the Cramér-Rao bound, and the bootstrap method)
- Generating functions (MGFs and PGFs)
- Modes of convergence, leading to proofs of the WLLN and CLT
- Markov chains, up to convergence to equilibrium and the ergodic theorem
- Brief introduction to MCMC (time permitting)
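The maximum likelihood item in the syllabus has a standard worked example: for a sample assumed to come from an Exponential(λ) distribution, setting the derivative of the log-likelihood to zero gives the closed-form estimate λ̂ = n / Σxᵢ, i.e. the reciprocal of the sample mean. A minimal Python sketch (the data values are made up for illustration):

```python
import math

# Hypothetical sample, assumed drawn from an Exponential(lambda) distribution.
data = [0.5, 1.2, 0.3, 2.0, 0.8, 1.5]

def neg_log_likelihood(lam, xs):
    """Negative log-likelihood of Exponential(lam): -sum(log(lam) - lam*x)."""
    return -sum(math.log(lam) - lam * x for x in xs)

# Closed-form MLE: differentiating the log-likelihood in lam and setting
# it to zero gives lambda_hat = n / sum(x) = 1 / (sample mean).
lam_hat = len(data) / sum(data)

print(lam_hat)  # ~0.952 for this sample
```

As a sanity check, `neg_log_likelihood` evaluated at `lam_hat` is smaller than at nearby values of λ, confirming that the closed-form estimate is the minimiser.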
| Task | Length | % of module mark |
|---|---|---|
| Closed/in-person Exam (Centrally scheduled): Probability & Markov Chains | N/A | 100 |
None
There will be five formative assignments, with marked work returned in the seminars. At least one of them will contain a longer written part, written in LaTeX.
| Task | Length | % of module mark |
|---|---|---|
| Closed/in-person Exam (Centrally scheduled): Probability & Markov Chains | N/A | 100 |
Current Department policy on feedback is available in the student handbook. Coursework and examinations will be marked and returned in accordance with this policy.
M. DeGroot and M. Schervish (2012), *Probability and Statistics* (4th edition), Pearson.