Lecture 18 – Logistic Regression, Part 1
Presented by Fernando Perez, Suraj Rampure
Content by Suraj Rampure, Josh Hug, Joseph Gonzalez, Ani Adhikari
The Quick Check for this lecture is due Monday, November 9th at 11:59 PM. One of the Google Forms below, chosen at random, will give you an alphanumeric code once you submit; enter this code into the “Lecture 18” question in the “Quick Check Codes” assignment on Gradescope to receive credit for submitting this Quick Check.
Classification, and a brief overview of the machine learning taxonomy.
Pitfalls of using least squares to model probabilities. Creating a graph of averages to motivate the logistic regression model.
Deriving the logistic regression model from the assumption that the log-odds of the probability of belonging to class 1 is linear in the features.
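The derivation described here can be sketched as follows (notation is a plausible choice, with σ denoting the logistic function):

```latex
% Assume the log-odds of p = P(Y = 1 \mid x) is a linear function of x:
\log \frac{p}{1 - p} = x^{\mathsf{T}} \theta
% Exponentiating both sides and solving for p gives the logistic model:
\frac{p}{1 - p} = e^{x^{\mathsf{T}} \theta}
\quad\Longrightarrow\quad
p = \frac{e^{x^{\mathsf{T}} \theta}}{1 + e^{x^{\mathsf{T}} \theta}}
  = \frac{1}{1 + e^{-x^{\mathsf{T}} \theta}}
  = \sigma\bigl(x^{\mathsf{T}} \theta\bigr)
```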
Formalizing the logistic regression model. Exploring properties of the logistic function. Interpreting the model coefficients.
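The key properties of the logistic function can be checked numerically. A minimal sketch in NumPy (the name `sigma` is an illustrative choice, not from the lecture code):

```python
import numpy as np

def sigma(t):
    """Logistic (sigmoid) function: maps any real number into (0, 1)."""
    return 1 / (1 + np.exp(-t))

# The curve passes through (0, 0.5).
print(sigma(0))                                  # 0.5

# Outputs approach 0 and 1 in the tails, so they behave like probabilities.
print(sigma(-10), sigma(10))

# Reflection symmetry: 1 - sigma(t) = sigma(-t).
t = np.linspace(-5, 5, 11)
print(np.allclose(1 - sigma(t), sigma(-t)))      # True
```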
Discussing the pitfalls of using squared loss with logistic regression.
Introducing cross-entropy loss as a better alternative to squared loss for logistic regression.
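As a sketch of what cross-entropy loss computes (the function name and example values below are illustrative, not taken from the lecture):

```python
import numpy as np

def cross_entropy_loss(y, p):
    """Average cross-entropy (log) loss.

    y: true labels in {0, 1}; p: predicted probabilities of class 1.
    Each term is -log(p_i) when y_i = 1 and -log(1 - p_i) when y_i = 0.
    """
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# A confident, correct prediction incurs a small loss...
print(cross_entropy_loss(np.array([1.0]), np.array([0.99])))   # ~0.01

# ...while a confident, wrong prediction is penalized heavily.
print(cross_entropy_loss(np.array([1.0]), np.array([0.01])))   # ~4.6
```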
Using maximum likelihood estimation to arrive at cross-entropy loss.
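The connection sketched in this section: model each label as a Bernoulli random variable, write down the likelihood, and observe that maximizing it is the same as minimizing average cross-entropy loss.

```latex
% Model each Y_i as Bernoulli(p_i), with p_i = \sigma(x_i^{\mathsf{T}} \theta).
% The likelihood of observing labels y_1, \dots, y_n is
L(\theta) = \prod_{i=1}^{n} p_i^{y_i} (1 - p_i)^{1 - y_i}
% Taking logs and negating, maximizing L(\theta) is equivalent to
% minimizing the average cross-entropy loss:
-\frac{1}{n} \log L(\theta)
  = -\frac{1}{n} \sum_{i=1}^{n}
    \bigl( y_i \log p_i + (1 - y_i) \log(1 - p_i) \bigr)
```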
Demo of using scikit-learn to fit a logistic regression model. An overview of what's coming next.
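A minimal sketch of the kind of scikit-learn usage the demo covers (the one-feature toy data here is made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical toy data: one feature, two well-separated classes.
X = np.array([[0.5], [1.0], [1.5], [3.5], [4.0], [4.5]])
y = np.array([0, 0, 0, 1, 1, 1])

model = LogisticRegression()
model.fit(X, y)

# predict() thresholds at P(Y = 1 | x) = 0.5; predict_proba() returns,
# for each row, a column for P(Y = 0 | x) and a column for P(Y = 1 | x).
print(model.predict(X))
print(model.predict_proba(X)[:, 1])
```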