# Lecture 19 – Logistic Regression, Part 1

Presented by Fernando Perez, Suraj Rampure

Content by Suraj Rampure, Josh Hug, Joseph Gonzalez, Ani Adhikari

A reminder – the right column of the table below contains *Quick Checks*. These are **not** required, but are suggested to help you check your understanding.

Video | Quick Check
---|---
19.1 Classification, and a brief overview of the machine learning taxonomy. | 19.1
19.2 Pitfalls of using least squares to model probabilities. Creating a graph of averages to motivate the logistic regression model. | 19.2
19.3 Deriving the logistic regression model from the assumption that the log-odds of the probability of belonging to class 1 is linear. | 19.3
19.4 Formalizing the logistic regression model. Exploring properties of the logistic function. Interpreting the model coefficients. | 19.4
19.5 Discussing the pitfalls of using squared loss with logistic regression. | 19.5
19.6 Introducing cross-entropy loss, as a better alternative to squared loss for logistic regression. | 19.6
19.7 Using maximum likelihood estimation to arrive at cross-entropy loss. | 19.7
19.8 Demo of using scikit-learn to fit a logistic regression model. An overview of what's coming next. | 19.8
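
For reference, videos 19.3 and 19.4 build the model from the assumption that the log-odds of belonging to class 1 is linear in the features. A minimal sketch of that step, using $\sigma$ for the logistic function and $x^{T}\theta$ as an assumed notation for the linear term:

```latex
% Assume the log-odds of p = P(Y = 1 | x) is linear in x:
%   log( p / (1 - p) ) = x^T theta.
% Solving for p recovers the logistic (sigmoid) function applied to x^T theta.
\[
\log \frac{p}{1 - p} = x^{T}\theta
\quad\Longrightarrow\quad
p = \frac{1}{1 + e^{-x^{T}\theta}} = \sigma\!\left(x^{T}\theta\right)
\]
```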
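
Videos 19.6 and 19.7 introduce cross-entropy loss and show how it arises from maximum likelihood estimation. A sketch of the average cross-entropy loss over $n$ points, with $\hat{y}_i = \sigma(x_i^{T}\theta)$ (this notation is assumed here, not taken from the videos):

```latex
% For labels y_i in {0, 1} and predicted probabilities y_hat_i = sigma(x_i^T theta),
% maximizing the Bernoulli likelihood is equivalent to minimizing the
% average cross-entropy loss:
\[
R(\theta) = -\frac{1}{n} \sum_{i=1}^{n}
\Bigl[\, y_i \log \hat{y}_i + (1 - y_i)\log\bigl(1 - \hat{y}_i\bigr) \Bigr]
\]
```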
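
Video 19.8 demos fitting a logistic regression model with scikit-learn. A minimal sketch of that workflow on a toy dataset (the feature and label arrays below are made up for illustration, not the lecture's data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data: one feature, binary labels (illustrative only).
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
y = np.array([0, 0, 0, 1, 1, 1])

# Fit the logistic regression model.
model = LogisticRegression()
model.fit(X, y)

# Predicted classes and class-1 probabilities for new points.
print(model.predict([[2.5], [4.5]]))
print(model.predict_proba([[2.5], [4.5]])[:, 1])
```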