Lecture 18 – Gradient Descent
Presented by Anthony D. Joseph
Content by Josh Hug, Joseph Gonzalez, Paul Shao
A reminder: the right column of the table below contains Quick Checks. These are optional, but we suggest working through them to check your understanding.
Gradient descent in one dimension. Convexity.
Various methods of optimizing loss functions in one dimension.
Gradient descent in multiple dimensions. Interpretation of gradients.
Stochastic gradient descent (SGD). Comparison between gradient descent and SGD.
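To accompany the topics above, here is a minimal sketch (not part of the lecture materials themselves) comparing full-batch gradient descent with stochastic gradient descent on a simple one-parameter squared-loss model. The data, step size, and batch size are illustrative assumptions.

```python
import numpy as np

# Illustrative setup (assumed, not from the lecture): fit a one-parameter
# linear model y ≈ theta * x by minimizing the mean squared loss
# L(theta) = mean((y - theta * x)^2).
rng = np.random.default_rng(0)
x = rng.normal(size=100)
true_theta = 3.0
y = true_theta * x + rng.normal(scale=0.1, size=100)

def grad(theta, xb, yb):
    # Gradient of the mean squared loss w.r.t. theta:
    # dL/dtheta = -2 * mean(x * (y - theta * x))
    return -2 * np.mean(xb * (yb - theta * xb))

def gradient_descent(theta0, alpha=0.1, steps=100):
    # Full-batch gradient descent: each update uses the ENTIRE dataset.
    theta = theta0
    for _ in range(steps):
        theta -= alpha * grad(theta, x, y)
    return theta

def sgd(theta0, alpha=0.1, epochs=20, batch_size=10):
    # Stochastic (mini-batch) gradient descent: each update uses a random
    # subset of the data, so steps are cheaper but noisier.
    theta = theta0
    n = len(x)
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            theta -= alpha * grad(theta, x[b], y[b])
    return theta

theta_gd = gradient_descent(0.0)
theta_sgd = sgd(0.0)
print(theta_gd, theta_sgd)  # both should land near the true slope of 3.0
```

The loss here is convex in theta, so both methods converge to (near) the same minimizer; SGD's path is noisier but each step touches only a fraction of the data.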