# Lecture 13 – Ordinary Least Squares

Presented by Anthony D. Joseph and Suraj Rampure

Content by Suraj Rampure, Ani Adhikari, Deb Nolan, Joseph Gonzalez

The Quick Check for this lecture is due **Monday, October 12th at 11:59PM**. To get the Gradescope submission code, you will need to follow the instructions at the end of one of these Google Forms. The instructions for this lecture are more involved than usual, as they will have you access the exam platform that we are using for next week's Midterm.

| Video | Quick Check |
|---|---|
| **13.1** A quick recap of the modeling process, and a roadmap for lecture. | 13.1 |
| **13.2** Defining the multiple linear regression model using linear algebra (dot products and matrix multiplication). Introducing the idea of a design matrix. | 13.2 |
| **13.3** Defining the mean squared error of the multiple linear regression model as the (scaled) norm of the residual vector. | 13.3 |
| **13.4** Using a geometric argument to determine the optimal model parameter. | 13.4 |
| **13.5** Residual plots. Properties of residuals, with and without an intercept term in our model. | 13.5 |
| **13.6** Discussing the conditions in which there isn't a unique solution for the optimal model parameter. A summary, and an outline of what is to come. | 13.6 |
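
The matrix formulation from 13.2 can be sketched as follows. This is a minimal illustration using NumPy and a made-up toy dataset (the variable names and values are hypothetical, not from lecture): each row of the design matrix is one observation, a leading column of ones handles the intercept, and the model's predictions are a single matrix-vector product.

```python
import numpy as np

# Hypothetical data: n = 4 observations of d = 2 features.
x1 = np.array([1.0, 2.0, 3.0, 4.0])
x2 = np.array([0.0, 1.0, 0.0, 1.0])

# Design matrix X: one row per observation; the leading column of
# ones incorporates the intercept term into the matrix product.
X = np.column_stack([np.ones_like(x1), x1, x2])   # shape (4, 3)

# A parameter vector theta of length d + 1 = 3.
theta = np.array([0.5, 2.0, -1.0])

# Predictions are X @ theta: each prediction is the dot product
# of one row of X with theta.
y_hat = X @ theta
```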
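
For 13.3, the mean squared error can be written as the squared norm of the residual vector, scaled by $1/n$. A small sketch with hypothetical values, checking that the norm-based formula agrees with the elementwise average:

```python
import numpy as np

# Hypothetical observations and model predictions.
y = np.array([2.0, 3.0, 7.0, 8.0])
y_hat = np.array([2.5, 3.5, 6.5, 7.5])
n = len(y)

e = y - y_hat                               # residual vector
mse = (np.linalg.norm(e) ** 2) / n          # (1/n) * ||e||^2
mse_direct = np.mean((y - y_hat) ** 2)      # same quantity, elementwise
```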
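
The geometric argument in 13.4 says the optimal residual vector must be orthogonal to the column space of the design matrix, which yields the normal equations $X^\top X \hat{\theta} = X^\top y$. A sketch with a hypothetical full-rank design matrix, cross-checked against NumPy's least-squares solver:

```python
import numpy as np

# Hypothetical design matrix (intercept column plus one feature).
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
y = np.array([2.0, 3.0, 7.0, 8.0])

# Solve the normal equations X^T X theta = X^T y directly.
theta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check against NumPy's least-squares solver.
theta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
```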
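
One residual property from 13.5 can be verified numerically: when the model includes an intercept term, the residuals of the optimal fit sum to zero, because the residual vector is orthogonal to every column of the design matrix, including the all-ones intercept column. A sketch with hypothetical data:

```python
import numpy as np

# Hypothetical design matrix whose first column is the intercept.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
y = np.array([2.0, 3.0, 7.0, 8.0])

theta_hat = np.linalg.solve(X.T @ X, X.T @ y)
residuals = y - X @ theta_hat

# Orthogonality to the all-ones column means residuals.sum() is 0;
# without that column, this property is not guaranteed.
```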
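
For 13.6, the optimal parameter fails to be unique exactly when the design matrix's columns are linearly dependent, so that $X^\top X$ is singular. A hypothetical sketch: duplicating a column makes the matrix rank-deficient, yet the fitted values remain unique because they depend only on the column space.

```python
import numpy as np

# Design matrix whose third column duplicates the second, so
# X^T X is singular and the optimal theta is not unique.
X = np.array([[1.0, 1.0, 1.0],
              [1.0, 2.0, 2.0],
              [1.0, 3.0, 3.0],
              [1.0, 4.0, 4.0]])
y = np.array([2.0, 3.0, 7.0, 8.0])

rank = np.linalg.matrix_rank(X)    # 2 < 3 columns: rank-deficient

# lstsq still returns one optimal theta (the minimum-norm one);
# the fitted values X @ theta are unique even though theta is not.
theta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ theta
```

Because the minimum-norm solution splits weight evenly across the duplicated columns, their two coefficients come out equal; any other split of that total would fit the data equally well.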