Lecture 13 – Simple Linear Regression
Presented by Anthony D. Joseph and Suraj Rampure
Content by Suraj Rampure and Ani Adhikari
A reminder: the right column of the table below contains Quick Checks. These are not required, but are suggested to help you check your understanding.
| Video | Quick Check |
| --- | --- |
| 13.0 Introduction and recap of the modeling process. | 13.0 |
| 13.1 The correlation coefficient and its properties. | 13.1 |
| 13.2 Defining the simple linear regression model, our first model with two parameters and an input variable. Motivating linear regression with the graph of averages. | 13.2 |
| 13.3 Using calculus to derive the optimal model parameters for the simple linear regression model when we choose squared loss as our loss function (see the code sketch below the table). | 13.3 |
| 13.4 Visualizing and interpreting the loss surface of the SLR model. | 13.4 |
| 13.5 Interpreting the slope of the simple linear model. | 13.5 |
| 13.6 Defining key terminology in the regression context. Expanding the simple linear model to include any number of features. | 13.6 |
| 13.7 RMSE as a metric of accuracy. Multiple R-squared as a metric of explained variation. Summary. | 13.7 |
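For reference, the main formulas from 13.1, 13.3, and 13.7 fit together in a short, self-contained Python sketch (the names `fit_slr` and `evaluate` are ours for illustration; this is not code from the lecture videos). Under squared loss, the optimal slope is r * SD(y) / SD(x) and the optimal intercept is mean(y) - slope * mean(x), where r is the correlation coefficient, i.e. the mean product of x and y measured in standard units.

```python
import numpy as np

def fit_slr(x, y):
    """Fit y ~ a + b*x by minimizing mean squared error.

    Uses the closed-form solution from 13.3:
        b = r * SD(y) / SD(x),   a = mean(y) - b * mean(x),
    where r is the correlation coefficient from 13.1.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    # Correlation coefficient: mean product of x and y in standard units.
    r = np.mean((x - x.mean()) / x.std() * (y - y.mean()) / y.std())
    b = r * y.std() / x.std()      # optimal slope
    a = y.mean() - b * x.mean()    # optimal intercept
    return a, b, r

def evaluate(x, y, a, b):
    """Return RMSE (accuracy) and R^2 (explained variation), as in 13.7."""
    y = np.asarray(y, dtype=float)
    residuals = y - (a + b * np.asarray(x, dtype=float))
    rmse = np.sqrt(np.mean(residuals ** 2))
    r_squared = 1 - np.var(residuals) / np.var(y)
    return rmse, r_squared

# Example on synthetic data: y is roughly 3 + 2x plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 3 + 2 * x + rng.normal(0, 1, size=100)

a, b, r = fit_slr(x, y)
rmse, r2 = evaluate(x, y, a, b)
print(f"intercept={a:.2f}, slope={b:.2f}, r={r:.3f}, RMSE={rmse:.3f}, R^2={r2:.3f}")
```

For the simple linear model, the R² reported here is exactly r², the square of the correlation coefficient; the name "multiple R-squared" anticipates models with more than one feature, as introduced in 13.6.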