Lecture 22 – Dimensionality Reduction

by Josh Hug (Fall 2019)

Important: This lecture is a combination of two lectures from the Fall 2019 semester.

  • There are a couple of small typos in 22.4. To check whether the columns of a matrix V form an orthonormal set, we should check whether V^T @ V is the identity matrix (not V @ V^T). For a matrix whose columns form an orthonormal set, the property that the transpose equals the inverse holds only if the matrix is square (see the code sketch after this list).
  • There is a set of extra slides at the end of the lecture slides. These slides contain a review of concepts in linear algebra such as matrix multiplication and rank.
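A minimal NumPy sketch of both points above (the matrices V and Q below are made-up examples, not taken from the lecture slides):

```python
import numpy as np

# A 3x2 matrix whose two columns are orthonormal (hypothetical example).
V = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# Correct check: the columns are orthonormal iff V^T @ V is the identity.
print(np.allclose(V.T @ V, np.eye(2)))      # True

# V @ V^T is generally NOT the identity when V is not square,
# so it cannot be used to test whether the columns are orthonormal.
print(np.allclose(V @ V.T, np.eye(3)))      # False

# Only for a square matrix with orthonormal columns (an orthogonal matrix)
# is the transpose also the inverse.
Q = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(np.allclose(Q.T, np.linalg.inv(Q)))   # True
```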
| Video | Quick Check |
| --- | --- |
| 22.1: Dimensionality. Visualizing high-dimensional data. | 22.1 |
| 22.2: More visualizations of high-dimensional data. | 22.2 |
| 22.3: Matrix decomposition, redundancy, and rank. Introduction to the singular value decomposition (SVD). | 22.3 |
| 22.4: The theory behind the singular value decomposition. Orthogonality and orthonormality. | 22.4 |
| 22.5: Low rank approximations with the singular value decomposition (see the sketch below). | 22.5 |
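As a rough illustration of the idea in 22.5 (not taken from the lecture; the data matrix X and the choice of k are made up), keeping only the k largest singular values of the SVD gives the best rank-k approximation in the least-squares (Frobenius norm) sense:

```python
import numpy as np

# Hypothetical data matrix; its shape and values are made up for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 20))   # rank at most 5

# SVD: X = U @ diag(s) @ Vt, with singular values s sorted in decreasing order.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Keep only the k largest singular values to form a rank-k approximation.
k = 2
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

print(X_k.shape)                # (100, 20), same shape as X
print(np.linalg.norm(X - X_k))  # reconstruction error (Frobenius norm)
```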