# Lecture 15 – Bias and Variance

by Ani Adhikari (Spring 2020)

**Important:** This lecture is taken from Spring 2020 (which again is why the video titles don’t match up with our numbering).

- In order to follow it, you must be familiar with the ideas from Lecture 3 (Random Variables).
- In the last lecture, we touched on this idea of “model complexity”. It is mentioned towards the end of this lecture, but will be covered more in-depth in Lecture 16 (Cross-Validation and Regularization).
- The algebra behind the decomposition of model risk into observation variance, model variance, and bias is not in the slides or video, but it is in the link above. You should read it **after** watching this lecture.

| Video | Quick Check |
| --- | --- |
| 15.1 Introducing the data generating process and prediction error. Model risk. | 15.1 |
| 15.2 Looking at different sources of error in our model – observation variance, model variance, and bias – and discussing how to mitigate them. | 15.2 |
| 15.3 Decomposing model risk into the sum of observation variance, model variance, and the square of bias. | 15.3 |
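The decomposition in 15.3 can be checked empirically with a small simulation. The sketch below (not part of the course materials; the true function, noise level, and sample size are illustrative choices) repeatedly draws data from a known data generating process, fits a linear model each time, and compares the directly estimated model risk at one point against the sum of observation variance, model variance, and squared bias.

```python
import numpy as np

rng = np.random.default_rng(0)

# Data generating process: Y = g(x) + eps (choices here are illustrative)
g = lambda x: x ** 2          # true function
sigma = 0.5                   # SD of observation noise
x0 = 0.8                      # point at which we evaluate prediction error

def fit_and_predict(n=30):
    """Fit a degree-1 polynomial to one random sample; predict at x0."""
    xs = rng.uniform(-1, 1, n)
    ys = g(xs) + rng.normal(0, sigma, n)
    slope, intercept = np.polyfit(xs, ys, 1)
    return slope * x0 + intercept

# Many refits of the model on fresh samples
preds = np.array([fit_and_predict() for _ in range(20_000)])

obs_var = sigma ** 2                      # observation variance
model_var = preds.var()                   # model variance
bias_sq = (preds.mean() - g(x0)) ** 2     # squared bias

# Model risk E[(Y - prediction)^2] at x0, estimated directly
ys_new = g(x0) + rng.normal(0, sigma, 20_000)
risk = ((ys_new - preds) ** 2).mean()

print(f"model risk     = {risk:.3f}")
print(f"decomposition  = {obs_var + model_var + bias_sq:.3f}")
```

The two printed numbers should agree up to Monte Carlo error, matching the identity derived in the reading: model risk = observation variance + model variance + bias².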