Lecture 18 – Logistic Regression, Part 1
Presented by Fernando Perez, Suraj Rampure
Content by Suraj Rampure, Josh Hug, Joseph Gonzalez, Ani Adhikari
The Quick Check for this lecture is due Monday, November 9th at 11:59PM. Submitting a randomly selected one of the Google Forms below will give you an alphanumeric code; enter this code into the “Lecture 18” question in the “Quick Check Codes” assignment on Gradescope to receive credit for submitting this Quick Check.
Video | Quick Check
---|---
18.1 Classification, and a brief overview of the machine learning taxonomy. | 18.1
18.2 Pitfalls of using least squares to model probabilities. Creating a graph of averages to motivate the logistic regression model. | 18.2
18.3 Deriving the logistic regression model from the assumption that the log-odds of the probability of belonging to class 1 is linear in the features (a sketch of this derivation appears after the table). | 18.3
18.4 Formalizing the logistic regression model. Exploring properties of the logistic function. Interpreting the model coefficients. | 18.4
18.5 Discussing the pitfalls of using squared loss with logistic regression. | 18.5
18.6 Introducing cross-entropy loss as a better alternative to squared loss for logistic regression (see the loss formulation after the table). | 18.6
18.7 Using maximum likelihood estimation to arrive at cross-entropy loss. | 18.7
18.8 Demo of using scikit-learn to fit a logistic regression model. An overview of what's coming next (a minimal usage sketch appears after the table). | 18.8
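
For reference, a sketch of the derivation covered in 18.3–18.4, written in assumed notation ($p$ for $P(Y = 1 \mid x)$ and $\beta$ for the coefficient vector; the lecture's own symbols may differ). Assuming the log-odds is linear in the features and solving for $p$ yields the logistic (sigmoid) function:

```latex
% Model assumption: the log-odds of belonging to class 1 is linear in x.
\underbrace{\log \frac{p}{1 - p}}_{\text{log-odds}} = x^{T}\beta
\;\Longrightarrow\;
\frac{p}{1 - p} = e^{x^{T}\beta}
\;\Longrightarrow\;
p = \frac{1}{1 + e^{-x^{T}\beta}} = \sigma\!\left(x^{T}\beta\right)
```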
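Similarly, a sketch of the loss discussed in 18.6–18.7, in the same assumed notation: cross-entropy loss on a single observation, and the Bernoulli log-likelihood it corresponds to (minimizing average cross-entropy over the data is equivalent to maximizing this log-likelihood, which is the connection 18.7 develops):

```latex
% Cross-entropy loss for one observation, with predicted probability
% \hat{p}_i = \sigma(x_i^T \beta) and label y_i in {0, 1}:
\ell\!\left(y_i, \hat{p}_i\right)
  = -\,y_i \log \hat{p}_i \;-\; (1 - y_i)\log\!\left(1 - \hat{p}_i\right)

% Bernoulli log-likelihood of the data; note it is the negated sum of
% the per-observation cross-entropy losses:
\log L(\beta)
  = \sum_{i=1}^{n} \left[\, y_i \log \hat{p}_i + (1 - y_i)\log\!\left(1 - \hat{p}_i\right) \right]
  = -\sum_{i=1}^{n} \ell\!\left(y_i, \hat{p}_i\right)
```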
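Finally, a minimal sketch of the kind of scikit-learn usage shown in the 18.8 demo. The synthetic data here is a stand-in; the demo's actual dataset and settings are not reproduced:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic binary classification data (illustrative only; the lecture
# demo uses its own dataset).
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 2))
# Generate labels whose log-odds is linear in the features, matching
# the logistic regression model assumption.
p = 1 / (1 + np.exp(-(1.5 * X[:, 0] - 2.0 * X[:, 1])))
y = rng.binomial(1, p)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# By default, scikit-learn fits logistic regression by maximizing an
# L2-regularized log-likelihood (C=1.0).
model = LogisticRegression()
model.fit(X_train, y_train)

print(model.intercept_, model.coef_)     # fitted log-odds coefficients
print(model.predict_proba(X_test[:5]))   # per-row [P(class 0), P(class 1)]
print(model.score(X_test, y_test))       # mean accuracy on held-out data
```

Note that `predict_proba` returns estimated probabilities rather than hard labels; thresholding those probabilities to produce class predictions is among the topics previewed at the end of this lecture.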