Video Lectures

Lecture 19: Saddle Points Continued, Maxmin Principle

Description

Professor Strang continues his discussion of saddle points, which are critical for deep learning applications. Later in the lecture, he reviews the Maxmin Principle, which characterizes the \(k\)th eigenvalue of a symmetric matrix as a maximum over \(k\)-dimensional subspaces of a minimum of the Rayleigh quotient.

Summary

\(x'Sx/x'x\) has saddle points at the eigenvalues between the lowest and the highest.
(Max over all \(k\)-dimensional subspaces) of (Min of \(x'Sx/x'x\)) = \(k\)th eigenvalue \(\lambda_k\)
Sample mean and expected mean
Sample variance and expected variance
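The max-min statement in the summary can be checked numerically. A minimal sketch with NumPy, using a small symmetric matrix chosen purely for illustration (not from the lecture):

```python
import numpy as np

# Hypothetical 3x3 symmetric matrix, for illustration only.
S = np.array([[5.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 3.0]])

# eigh returns eigenvalues in ascending order, eigenvectors as columns.
eigvals, eigvecs = np.linalg.eigh(S)

def rayleigh(S, x):
    """Rayleigh quotient x'Sx / x'x."""
    return (x @ S @ x) / (x @ x)

# At each eigenvector the quotient equals the corresponding eigenvalue.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.isclose(rayleigh(S, v), lam)

# Maxmin check: over the subspace spanned by the top-k eigenvectors,
# the minimum of the quotient is the k-th largest eigenvalue.
# Sample random directions in that subspace; the minimum stays above it.
k = 2
top = eigvecs[:, -k:]            # basis for the top-k eigenvector subspace
rng = np.random.default_rng(0)
samples = [rayleigh(S, top @ rng.standard_normal(k)) for _ in range(1000)]
print(min(samples))              # approaches the 2nd largest eigenvalue
```

The minimum over random samples bounds the \(k\)th eigenvalue from above and approaches it as the sampling covers the subspace; intermediate eigenvalues are saddle points of the quotient, not maxima or minima.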

Related sections in textbook: III.2 and V.1

Instructor: Prof. Gilbert Strang

Problems for Lecture 19
From textbook Sections III.2 and V.1

3. We know \(\frac{1}{3}\) of all integers are divisible by 3 and \(\frac{1}{7}\) of integers are divisible by 7. What fraction of integers will be divisible by 3 or 7 or both?
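As a quick sanity check on the inclusion-exclusion count behind this problem (not part of the textbook's solution), one full period of length 21 decides the answer:

```python
from fractions import Fraction

# Inclusion-exclusion: P(div by 3 or 7) = 1/3 + 1/7 - 1/21.
answer = Fraction(1, 3) + Fraction(1, 7) - Fraction(1, 21)

# Direct count over one full period 1..21 (divisibility repeats every 21).
count = sum(1 for n in range(1, 22) if n % 3 == 0 or n % 7 == 0)
assert Fraction(count, 21) == answer
print(answer)  # 3/7
```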

8. Equation (4) gave a second equivalent form for \(S^2\) (the variance using samples):

$$ \boldsymbol{S^2}=\frac{1}{N-1} \text{ sum of } (x_i-m)^2=\frac{1}{N-1}\left[\left(\text{sum of } x^2_i\right)-Nm^2\right]. $$

Verify the matching identity for the expected variance \(\sigma^2\) (using \(m=\Sigma\, p_i\, x_i\)):

$$ \boldsymbol{\sigma^2}= \textbf{ sum of } \boldsymbol{p_i\left(x_i-m\right)^2}=\left(\textbf{sum of } \boldsymbol{p_i\,x_i^2}\right)-\boldsymbol{m^2}. $$
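The identity to be verified can be confirmed numerically. A minimal sketch with a small discrete distribution invented for illustration:

```python
import numpy as np

# Hypothetical discrete distribution (values and probabilities made up).
x = np.array([1.0, 2.0, 5.0, 8.0])
p = np.array([0.1, 0.4, 0.3, 0.2])   # probabilities sum to 1

m = np.sum(p * x)                     # expected mean m = sum of p_i x_i

# Left side: sum of p_i (x_i - m)^2
var_centered = np.sum(p * (x - m) ** 2)
# Right side: (sum of p_i x_i^2) - m^2
var_shortcut = np.sum(p * x ** 2) - m ** 2

assert np.isclose(var_centered, var_shortcut)
print(var_centered, var_shortcut)
```

Expanding \((x_i-m)^2\) and using \(\Sigma\, p_i = 1\) and \(\Sigma\, p_i x_i = m\) gives the same cancellation algebraically: the cross term contributes \(-2m^2\) and the constant term \(+m^2\).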

Course Info

As Taught In
Spring 2018