# Lecture 12 – Simple Linear Regression

Presented by Anthony D. Joseph and Suraj Rampure

Content by Suraj Rampure and Ani Adhikari

A random one of the following Google Forms will give you an alphanumeric code once you submit; you should take this code and enter it into the “Lecture 12” question in the “Quick Check Codes” assignment on Gradescope to get credit for this Quick Check. You must submit it by **Monday, October 12th at 11:59PM**.

Video | Quick Check
---|---
12.0 Introduction and recap of the modeling process. | 12.0
12.1 The correlation coefficient and its properties. | 12.1
12.2 Defining the simple linear regression model, our first model with two parameters and an input variable. Motivating linear regression with the graph of averages. | 12.2
12.3 Using calculus to derive the optimal model parameters for the simple linear regression model, when we choose squared loss as our loss function. | 12.3
12.4 Visualizing and interpreting the loss surface of the SLR model. | 12.4
12.5 Interpreting the slope of the simple linear model. | 12.5
12.6 Defining key terminology in the regression context. Expanding the simple linear model to include any number of features. | 12.6
12.7 RMSE as a metric of accuracy. Multiple R-squared as a metric of explained variation. Summary. | 12.7
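The formulas covered in 12.1–12.7 can be sketched in a few lines of NumPy. This is a minimal illustration on a made-up toy dataset (the `x` and `y` values are hypothetical, not from lecture): it computes the correlation coefficient $r$, the optimal slope $\hat{b} = r \cdot \sigma_y / \sigma_x$ and intercept $\hat{a} = \bar{y} - \hat{b}\bar{x}$ under squared loss, and then RMSE and multiple $R^2$.

```python
import numpy as np

# Toy dataset (hypothetical values, for illustration only).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 3.0, 5.0, 4.0, 6.0])

# 12.1: correlation coefficient r, the mean of the products
# of x and y in standard units.
r = np.mean((x - x.mean()) / x.std() * (y - y.mean()) / y.std())

# 12.3: optimal SLR parameters under squared loss.
slope = r * y.std() / x.std()
intercept = y.mean() - slope * x.mean()

# 12.7: RMSE (accuracy) and multiple R-squared (explained variation).
y_hat = intercept + slope * x
rmse = np.sqrt(np.mean((y - y_hat) ** 2))
r_squared = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

print(r, slope, intercept, rmse, r_squared)
```

For simple linear regression, multiple $R^2$ equals $r^2$, which this sketch lets you verify numerically.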