Lecture 18 – Logistic Regression, Part 1
by Suraj Rampure (Summer 2020)
Make sure to complete the Quick Check questions in between each video. These are ungraded, but it’s in your best interest to do them.
Classification, and a brief overview of the machine learning taxonomy.
Pitfalls of using least squares to model probabilities. Creating a graph of averages to motivate the logistic regression model.
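The "graph of averages" idea can be sketched in a few lines: bin a feature, then average the binary labels within each bin to estimate P(Y = 1 | x). This is a toy illustration with simulated data (all names and parameters here are hypothetical, not from the lecture):

```python
import numpy as np

# toy data: class-1 probability follows a logistic curve in x (hypothetical)
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, 500)
p_true = 1 / (1 + np.exp(-(x - 5)))              # true P(Y = 1 | x)
y = (rng.uniform(size=500) < p_true).astype(int)  # binary labels

# graph of averages: average of y within each bin of x
bins = np.linspace(0, 10, 11)
idx = np.digitize(x, bins)
avg = [y[idx == i].mean() for i in range(1, 11)]  # empirical P(Y = 1) per bin
```

Plotting `avg` against the bin centers produces the S-shaped curve that motivates modeling the probability with a logistic function rather than a line.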
Deriving the logistic regression model from the assumption that the log-odds of belonging to class 1 is a linear function of the features.
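The derivation in this segment can be summarized in one chain, writing $p$ for the class-1 probability, $\theta$ for the coefficient vector, and $\sigma$ for the logistic function (notation assumed here, not fixed by the outline):

```latex
\log \frac{p}{1 - p} = x^T \theta
\;\implies\; \frac{p}{1 - p} = e^{x^T \theta}
\;\implies\; p = \frac{e^{x^T \theta}}{1 + e^{x^T \theta}}
= \frac{1}{1 + e^{-x^T \theta}} = \sigma(x^T \theta)
```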
Formalizing the logistic regression model. Exploring properties of the logistic function. Interpreting the model coefficients.
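The key properties of the logistic function can be checked numerically. A minimal sketch (the name `sigma` is my own):

```python
import numpy as np

def sigma(t):
    """Logistic (sigmoid) function: maps any real t into (0, 1)."""
    return 1 / (1 + np.exp(-t))

# Properties discussed in the video:
#   sigma(0) = 0.5                    (symmetric about 0)
#   sigma(-t) = 1 - sigma(t)          (reflection symmetry)
#   sigma is monotonically increasing, with range (0, 1)
```

These properties are what make $\sigma(x^T\theta)$ a sensible model for a probability: the output is always a valid probability no matter the input.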
Discussing the pitfalls of using squared loss with logistic regression.
Introducing cross-entropy loss as a better alternative to squared loss for logistic regression.
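Average cross-entropy loss is short enough to write out directly. A sketch, assuming `y` holds binary labels and `p` holds the model's predicted class-1 probabilities:

```python
import numpy as np

def cross_entropy(y, p):
    """Average cross-entropy (log) loss for binary labels y
    and predicted probabilities p, both NumPy arrays."""
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
```

For a single observation, the loss is $-\log p$ when $y = 1$ and $-\log(1 - p)$ when $y = 0$, so confident wrong predictions are penalized heavily.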
Using maximum likelihood estimation to arrive at cross-entropy loss.
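The MLE argument in this segment can be summarized as follows, with $p_i = \sigma(x_i^T\theta)$ for $n$ independent observations (notation assumed):

```latex
L(\theta) = \prod_{i=1}^{n} p_i^{y_i} (1 - p_i)^{1 - y_i}
\;\implies\;
\log L(\theta) = \sum_{i=1}^{n} \left[ y_i \log p_i + (1 - y_i) \log(1 - p_i) \right]
```

Maximizing $\log L(\theta)$ is therefore equivalent to minimizing the average cross-entropy loss $-\frac{1}{n} \sum_{i=1}^{n} \left[ y_i \log p_i + (1 - y_i) \log(1 - p_i) \right]$.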
Demo of using scikit-learn to fit a logistic regression model. An overview of what's coming next.
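A minimal version of such a scikit-learn fit might look like this (the toy data here is my own, not the demo's):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# toy training data: one feature, binary labels (hypothetical)
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([0, 0, 0, 1, 1, 1])

model = LogisticRegression()
model.fit(X, y)

print(model.predict([[2.5]]))        # predicted class label
print(model.predict_proba([[2.5]]))  # [P(class 0), P(class 1)]
```

Note that `predict` thresholds the estimated probability at 0.5 by default, while `predict_proba` returns the probabilities themselves.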