Lecture 18 – Logistic Regression, Part 1
by Suraj Rampure (Summer 2020)
Make sure to complete the Quick Check questions between videos. These are ungraded, but it's in your best interest to do them.
| Video | Quick Check |
| --- | --- |
| 18.1 Classification, and a brief overview of the machine learning taxonomy. | 18.1 |
| 18.2 Pitfalls of using least squares to model probabilities. Creating a graph of averages to motivate the logistic regression model. | 18.2 |
| 18.3 Deriving the logistic regression model from the assumption that the log-odds of the probability of belonging to class 1 is linear. | 18.3 |
| 18.4 Formalizing the logistic regression model. Exploring properties of the logistic function. Interpreting the model coefficients. (A notation recap follows the table.) | 18.4 |
| 18.5 Discussing the pitfalls of using squared loss with logistic regression. | 18.5 |
| 18.6 Introducing cross-entropy loss as a better alternative to squared loss for logistic regression. | 18.6 |
| 18.7 Using maximum likelihood estimation to arrive at cross-entropy loss. (A loss recap follows the table.) | 18.7 |
| 18.8 Demo of using scikit-learn to fit a logistic regression model. An overview of what's coming next. (A minimal code sketch follows the table.) | 18.8 |
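
Videos 18.3 and 18.4 build the model from the assumption that the log-odds of belonging to class 1 is a linear function of the features. As a quick recap of that derivation (written here with $x^T\theta$ for the linear combination and $\sigma$ for the logistic function; the notation in the videos may differ slightly):

$$
\log \frac{P(Y = 1 \mid x)}{1 - P(Y = 1 \mid x)} = x^T\theta
\quad \Longleftrightarrow \quad
P(Y = 1 \mid x) = \sigma(x^T\theta) = \frac{1}{1 + e^{-x^T\theta}}
$$

Solving the log-odds equation on the left for the probability yields the form on the right, which is where the logistic function comes from.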
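
Videos 18.6 and 18.7 replace squared loss with cross-entropy loss and show that the same loss falls out of maximum likelihood estimation. With $\hat{p}_i = \sigma(x_i^T\theta)$ denoting the predicted probability for observation $i$ (again, this notation is a sketch and may not match the videos exactly), the average cross-entropy loss is

$$
R(\theta) = -\frac{1}{n} \sum_{i=1}^{n} \Big( y_i \log \hat{p}_i + (1 - y_i) \log(1 - \hat{p}_i) \Big).
$$

Minimizing $R(\theta)$ is equivalent to maximizing the Bernoulli likelihood $\prod_{i=1}^{n} \hat{p}_i^{\,y_i} (1 - \hat{p}_i)^{\,1 - y_i}$, which is the connection video 18.7 draws.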
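
Video 18.8 walks through fitting the model with scikit-learn; the demo notebook is the authoritative reference, but a minimal sketch of the kind of call it makes (on synthetic data, with variable names invented here) looks like this:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic data: one feature, where larger values make class 1 more likely.
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 1))
y = (X[:, 0] + rng.normal(scale=0.5, size=100) > 0).astype(int)

# Fit the model. Note that scikit-learn applies L2 regularization by default (C=1.0).
model = LogisticRegression()
model.fit(X, y)

print(model.intercept_, model.coef_)  # fitted intercept and slope
print(model.predict_proba(X[:5]))     # columns: P(Y=0 | x), P(Y=1 | x)
print(model.predict(X[:5]))           # class predictions, thresholded at 0.5
```

Here `predict_proba` returns the modeled probabilities, which is what logistic regression actually outputs; `predict` simply thresholds them to produce class labels.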