# Lecture 22 – Logistic Regression, Part 1

Presented by Fernando Perez, Suraj Rampure

Content by Suraj Rampure, Josh Hug, Joseph Gonzalez, Ani Adhikari

A reminder – the right column of the table below contains *Quick Checks*. These are **not** required but suggested to help you check your understanding.

Video | Quick Check
---|---
22.0 Logistics |
22.1 Classification, and a brief overview of the machine learning taxonomy. | 22.1
22.2 Pitfalls of using least squares to model probabilities. Creating a graph of averages to motivate the logistic regression model. | 22.2
22.3 Deriving the logistic regression model from the assumption that the log-odds of the probability of belonging to class 1 is linear. | 22.3
22.4 Formalizing the logistic regression model. Exploring properties of the logistic function. Interpreting the model coefficients. | 22.4
22.5 Discussing the pitfalls of using squared loss with logistic regression. | 22.5
22.6 Introducing cross-entropy loss, as a better alternative to squared loss for logistic regression. | 22.6
22.7 Using maximum likelihood estimation to arrive at cross-entropy loss. | 22.7
22.8 Demo of using scikit-learn to fit a logistic regression model. An overview of what's coming next. | 22.8
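As a quick companion to videos 22.4 and 22.6, here is a minimal sketch (plain Python, with function names of my own choosing, not taken from the lecture) of the logistic function and the cross-entropy loss for a single observation:

```python
import math

def sigmoid(t):
    # Logistic function: maps a log-odds value t to a probability in (0, 1).
    return 1 / (1 + math.exp(-t))

def cross_entropy(y, p):
    # Cross-entropy loss for one observation, where y in {0, 1} is the true
    # class and p is the predicted probability of belonging to class 1.
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

# A log-odds of 0 corresponds to a probability of exactly 0.5.
print(sigmoid(0))  # 0.5

# The logistic function is symmetric: sigmoid(-t) = 1 - sigmoid(t).
print(sigmoid(2) + sigmoid(-2))

# Loss when the model assigns probability 0.5 to the true class: -log(0.5).
print(cross_entropy(1, 0.5))
```

Note that cross-entropy heavily penalizes confident wrong predictions (the loss grows without bound as p approaches the wrong extreme), which is one reason it is preferred over squared loss here.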