In Asymptotic Statistics we study the asymptotic behaviour of (aspects of) statistical procedures. Here “asymptotic” means that we study limiting behaviour as the number of observations tends to infinity. A first important reason for doing this is that in many cases it is very hard, if not impossible, to derive, for instance, exact distributions of test statistics for fixed sample sizes. Asymptotic results are often easier to obtain, and can then be used to construct tests or confidence regions that *approximately* have the correct uncertainty level. Similarly, determining estimators or other procedures that are optimal in a specific sense, for instance in the sense of minimal mean squared error or variance, is often not possible when the number of observations is fixed. Using asymptotic results, however, it is in many cases possible to exhibit procedures that are *asymptotically* optimal.

In this course we begin by treating the mathematical machinery from probability theory that is needed to formulate and prove the statements of asymptotic statistics. The key ingredients are the various notions of stochastic convergence and their interrelations, the law of large numbers and the central limit theorem (which students are assumed to know), the multivariate normal distribution, and the so-called delta method. We then use these tools to study the asymptotic behaviour of statistical procedures.
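As a small illustration of the kind of approximation these tools deliver, the sketch below (not part of the lecture notes; the distribution, transformation, and sample sizes are chosen purely for illustration) checks the delta method by simulation: if √n(X̄ₙ − μ) converges to N(0, σ²), then for a smooth g, √n(g(X̄ₙ) − g(μ)) converges to N(0, g′(μ)²σ²).

```python
import random
import statistics

# Illustrative Monte Carlo check of the delta method.
# X_i ~ Exp(1), so mu = 1 and sigma^2 = 1. With g(x) = x^2 we have
# g'(mu) = 2, so the limiting variance of sqrt(n)*(Xbar^2 - 1) is
# g'(mu)^2 * sigma^2 = 4.

random.seed(0)
n = 1000      # sample size per replication
reps = 3000   # number of Monte Carlo replications
mu = 1.0

draws = []
for _ in range(reps):
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    draws.append(n ** 0.5 * (xbar ** 2 - mu ** 2))

print(statistics.mean(draws))      # should be close to 0
print(statistics.variance(draws))  # should be close to 4
```

The empirical mean and variance of the rescaled statistic match the normal limit predicted by the delta method, even though the exact finite-sample distribution of X̄ₙ² is awkward to work with directly.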

It is assumed that students have successfully completed introductory courses on probability theory and statistics, as well as courses on linear algebra and multivariate calculus. It is highly recommended to also follow a course on measure-theoretic probability.

### Announcements

- Time/place final retake: Wednesday 19 February, 10:00-13:00, Science Park 904, room B0.201.
- See below for a detailed list of what you need to know for the retake.

### Course information

- Lecturer: Harry van Zanten
- TA: Paul Dobson
- 2 hours of lecture + 1 hour exercise class every week
- Lecture notes: Aad van der Vaart’s lecture notes (can be downloaded here). Additional chapters on minimax lower bounds and high-dimensional models: here.
- Recommended literature: *Asymptotic Statistics*, by A.W. van der Vaart, Cambridge University Press.
- Exams: midterm exam (40%) + final exam (60%). You need at least a 5.0 for the final exam in order to pass. The retake is a single exam (100%).

### Midterm exam

- Material covered: Chapters 1-3 of the lecture notes.
- You may skip the proofs of Theorem 1.8 and Lemma 1.9. You may completely skip Theorem 2.10.
- You need to know the exact formulation of all results and the ideas of all proofs (except the ones you may skip, listed above).
- You need to know the proofs of the following results by heart: Lemma 2.3 (density of the multivariate normal), Theorem 2.7 (multivariate CLT), Theorem 3.1 (delta method).
- You will not be allowed to use calculators, books, notes, laptops, etc.

### Final exam

- Material: everything we covered, as described in the table below.
- You need to know the exact formulation of all results and the ideas of all proofs (except the ones you may skip, see the table).
- You need to know the proofs of the following results by heart: Theorem 4.5 (consistency of M-estimators), Theorem 4.11 (asymptotic normality of Z-estimators), Theorem 6.3.1 (minimax lower bound for smooth parametric models).
- You will not be allowed to use calculators, books, notes, laptops, etc.

### Retake

- The retake is a single exam about everything we covered, as described in the table below.
- You need to know the exact formulation of all results and the ideas of all proofs (except the ones you may skip, see the table).
- You need to know the proofs of the following results by heart: Lemma 2.3 (density of the multivariate normal), Theorem 2.7 (multivariate CLT), Theorem 3.1 (delta method), Theorem 4.5 (consistency of M-estimators), Theorem 4.11 (asymptotic normality of Z-estimators), Theorem 6.3.1 (minimax lower bound for smooth parametric models).
- You will not be allowed to use calculators, books, notes, laptops, etc.

### Old exams

### Slides

- Lecture 1
- Lecture 2
- Lecture 3
- Lecture 4
- Lecture 5
- Lecture 6
- Lecture 7
- Lecture 8
- Lecture 9
- Lecture 10
- Lecture 11

### What we have done so far

| Lecture | Date | Topic | Material | Exercises | Remarks |
|---|---|---|---|---|---|
| 1 | 11/9 | introduction, convergence | slides + notes Sec. 1.1 up to and including Theorem 1.7 | 1.1, 1.2, 1.3(i), 1.4, 1.10, 1.15 | |
| 2 | 18/9 | convergence | rest of Sec. 1.1, Sec. 1.2 | 1.11, 1.12, 1.17, 1.28, 1.29, 1.32 | 1) Skip proofs of Prohorov and Helly. 2) There is an error in 1.28; try to correct it! |
| 3 | 2/10 | multivariate normal | Sec. 2.1-2.4 | 2.1, 2.2, 2.5, 2.8, 2.13, 2.16, 2.17 | |
| 4 | 9/10 | chi-square test and delta method | Sec. 2.5, 3.1 | 2.22, 2.23(i), 3.1, 3.2, 3.3 | Skip Theorem 2.10. |
| 5 | 16/10 | delta method | Sec. 3.2, 3.3 | 3.12, 3.18 | |
| 6 | 30/10 | M-estimators, consistency | Chap. 4 up to and including p. 45 + extra material in slides | 4.1, 4.5, 4.8, 4.11(i), 4.12(i, ii) | Extra material on Glivenko-Cantelli theorems under bracketing is in the slides. |
| 7 | 6/11 | M-estimators, asymptotic normality, MLE | rest of Chap. 4 | 4.13, 4.14, 4.16, 4.18, 4.25 | |
| 8 | 20/11 | nonparametric estimation | Chapter 5 | 5.1, 5.2, 5.3, 5.6 | |
| 9 | 27/11 | minimax lower bounds 1 | Sec. 6.1, 6.2 | 6.1 | Material from the additional chapter, see above. |
| 10 | 4/12 | minimax lower bounds 2 | rest of Chapter 6 | 6.2-6.7 | |
| 11 | 11/12 | high-dimensional models | Chapter 7 | 7.3-7.5, 7.7-7.9 | last lecture |