Lecture 17 – Gradient Descent

by Josh Hug (Fall 2019) and Joseph Gonzalez (Spring 2020)

Important: This lecture combines content from the Fall 2019 and Spring 2020 offerings.

  • To follow the lecture, you should be familiar with the ideas from Discussion 1 Problem 2 (Calculus).
  • The reference to Homework 6 Problem 7 in video 17.2 should instead be to Homework 5 Problem 3.
  • Homework 7 gives you more practice with learning rates and gradient descent.
  • An updated version of the Loss Game mentioned in video 17.3 is available.
| Video | Quick Check |
| --- | --- |
| 17.1 Gradient descent in one dimension. Convexity. (See the first sketch below.) | 17.1 |
| 17.2 Various methods of optimizing loss functions in one dimension. | 17.2 |
| 17.3 Gradient descent in multiple dimensions. Interpretation of gradients. | 17.3 |
| 17.4 Stochastic gradient descent (SGD). Comparison between gradient descent and SGD. (See the second sketch below.) | 17.4 |
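
To make the idea in video 17.1 concrete, here is a minimal sketch of gradient descent in one dimension. The function name `gradient_descent_1d`, the learning rate `alpha`, the step count, and the example loss $(\theta - 3)^2$ are illustrative assumptions, not anything defined in the lecture itself.

```python
def gradient_descent_1d(dloss, theta0, alpha=0.1, n_steps=100):
    """Minimize a 1-D loss given its derivative dloss, starting from theta0.

    All names and defaults here are illustrative, not from the lecture.
    """
    theta = theta0
    for _ in range(n_steps):
        theta = theta - alpha * dloss(theta)  # step opposite the derivative
    return theta

# Example: L(theta) = (theta - 3)^2 is convex with derivative 2 * (theta - 3),
# so gradient descent should approach the unique minimizer theta = 3.
print(gradient_descent_1d(lambda t: 2 * (t - 3), theta0=0.0))  # ~3.0
```

Because this example loss is convex, any learning rate small enough to avoid overshooting will drive the iterates to the single global minimum.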
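Likewise, a minimal sketch of the SGD idea from video 17.4: instead of computing the gradient over the entire dataset on every step, as batch gradient descent does, update the parameter using one shuffled data point at a time. The names `sgd` and `grad_single`, the constant model, and all hyperparameters below are assumptions for illustration.

```python
import numpy as np

def sgd(grad_single, data, theta0, alpha=0.05, n_epochs=50, seed=0):
    """Stochastic gradient descent: update theta one shuffled point at a time.

    grad_single(theta, point) is a hypothetical helper returning the gradient
    of the loss on a single data point; names and defaults are illustrative.
    """
    rng = np.random.default_rng(seed)
    theta = theta0
    for _ in range(n_epochs):
        for i in rng.permutation(len(data)):  # one epoch = one shuffled pass
            theta = theta - alpha * grad_single(theta, data[i])
    return theta

# Example: fit a constant model c by minimizing squared loss (y_i - c)^2.
# The single-point gradient with respect to c is -2 * (y_i - c).
y = np.array([2.0, 4.0, 6.0])
print(sgd(lambda c, yi: -2 * (yi - c), y, theta0=0.0))  # hovers near mean(y) = 4
```

Each SGD update is far cheaper than a full-batch gradient step but noisier, so the iterates fluctuate around the minimizer rather than settling on it exactly; that trade-off is the comparison video 17.4 explores.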