# Lecture 12 – Simple Linear Regression

Presented by Suraj Rampure

Content by Suraj Rampure and Ani Adhikari

A reminder – the right column of the table below contains *Quick Checks*. These are **not** required, but they are suggested to help you check your understanding.

Video | Quick Check
--- | ---
12.0 Introduction |
12.1 The correlation coefficient and its properties. | 12.1
12.2 Defining the simple linear regression model. Motivating linear regression with the graph of averages. | 12.2
12.3 Using calculus to derive the optimal model parameters for the simple linear regression model, when we choose squared loss as our loss function. | 12.3
12.4 Visualizing and interpreting the loss surface of the SLR model. | 12.4
12.5 Interpreting the slope of the simple linear model. | 12.5
12.6 Defining key terminology in the regression context. Expanding the simple linear model to include any number of features. | 12.6
12.7 RMSE as a metric of accuracy. Multiple R-squared as a metric of explained variation. Summary. | 12.7
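As a preview of videos 12.1–12.3, here is a minimal sketch (in NumPy, with made-up illustrative data) of the correlation coefficient and the closed-form least-squares parameters for the simple linear regression model:

```python
import numpy as np

# Illustrative data, not from the lecture.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Correlation coefficient r: the mean of the products of x and y,
# each measured in standard units.
r = np.mean((x - x.mean()) / x.std() * (y - y.mean()) / y.std())

# Minimizing squared loss for the model y ≈ a + b*x gives the
# closed-form optimal parameters:
b = r * y.std() / x.std()     # slope
a = y.mean() - b * x.mean()   # intercept
```

These formulas agree with any standard least-squares fit (e.g. `np.polyfit(x, y, 1)`); the derivation via calculus is covered in video 12.3.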
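Similarly, the two metrics from video 12.7 can be sketched in a few lines (again with illustrative data; the fitted line comes from NumPy's least-squares `polyfit`):

```python
import numpy as np

# Illustrative data, not from the lecture.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit the simple linear model y ≈ a + b*x by least squares.
b, a = np.polyfit(x, y, 1)
y_hat = a + b * x

# RMSE: the typical size of a prediction error, in the units of y.
rmse = np.sqrt(np.mean((y - y_hat) ** 2))

# Multiple R-squared: the proportion of the variance in y
# explained by the model.
r_squared = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

For simple linear regression, this R-squared equals the square of the correlation coefficient between x and y.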