
Lecture 8: Norms of Vectors and Matrices

Description

A norm is a way to measure the size of a vector, a matrix, a tensor, or a function. Professor Strang reviews a variety of norms that are important to understand, including S-norms, the nuclear norm, and the Frobenius norm.
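
The norms mentioned here are easy to experiment with numerically. Below is a minimal sketch using NumPy's `np.linalg.norm`; the vector `v` and matrix `A` are arbitrary illustrations, not examples from the lecture.

```python
import numpy as np

# Arbitrary example vector and matrix, chosen only for illustration.
v = np.array([3.0, -4.0, 1.0])
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Vector norms: l1 (sum of |v_i|), l2 (Euclidean), l-infinity (max |v_i|).
print(np.linalg.norm(v, 1))        # 8.0
print(np.linalg.norm(v, 2))        # sqrt(26) ~ 5.099
print(np.linalg.norm(v, np.inf))   # 4.0

# Matrix norms: spectral norm (largest singular value), Frobenius norm
# (square root of the sum of squared entries), nuclear norm (sum of
# singular values).
print(np.linalg.norm(A, 2))
print(np.linalg.norm(A, 'fro'))
print(np.linalg.norm(A, 'nuc'))
```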

Summary

The \(\ell^1\) and \(\ell^2\) and \(\ell^\infty\) norms of vectors
The unit ball of vectors with norm \(\leq\) 1
Matrix norm = largest growth factor = max \( \Vert Ax \Vert / \Vert x \Vert\)
Orthogonal matrices have \(\Vert Q \Vert_2 = 1\) and \(\Vert Q \Vert^2_F = n\)

Related section in textbook: I.11
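
The last two summary points can be checked numerically. The sketch below (assuming NumPy; the matrices are random examples, not from the lecture) estimates the matrix 2-norm as the largest growth factor max \( \Vert Ax \Vert / \Vert x \Vert\) and verifies that an orthogonal \(Q\) has \(\Vert Q \Vert_2 = 1\) and \(\Vert Q \Vert^2_F = n\).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Matrix norm as largest growth factor: sample many vectors x and
# compare max ||Ax|| / ||x|| with the largest singular value of A.
A = rng.standard_normal((n, n))
X = rng.standard_normal((n, 10000))
growth = np.linalg.norm(A @ X, axis=0) / np.linalg.norm(X, axis=0)
print(growth.max())          # slightly below the true maximum
print(np.linalg.norm(A, 2))  # = sigma_max(A), the exact value

# Orthogonal Q (from a QR factorization): Q preserves lengths, so
# ||Q||_2 = 1, and every column has length 1, so ||Q||_F^2 = n.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
print(np.linalg.norm(Q, 2))         # 1.0 (up to roundoff)
print(np.linalg.norm(Q, 'fro')**2)  # n = 4
```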

Instructor: Prof. Gilbert Strang

Problems for Lecture 8
From textbook Section I.11

1. Show directly this fact about \(\ell^1\) and \(\ell^2\) and \(\ell^\infty\) vector norms: \(||\boldsymbol{v}||^2_2\leq ||\boldsymbol{v}||_1\;||\boldsymbol{v}||_\infty\).
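
One possible route (a sketch, not necessarily the intended proof): bound each term \(|v_i|^2\) by \(||\boldsymbol{v}||_\infty\,|v_i|\) and sum over \(i\):
\[
||\boldsymbol{v}||_2^2=\sum_i |v_i|^2\leq\sum_i ||\boldsymbol{v}||_\infty\,|v_i|=||\boldsymbol{v}||_\infty\sum_i |v_i|=||\boldsymbol{v}||_\infty\,||\boldsymbol{v}||_1.
\]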

7. A short proof of \(||AB||_F\leq||A||_F\,||B||_F\) starts from multiplying rows times columns:
\(|(AB)_{ij}|^2\leq||\text{row }i\text{ of }A||^2\,||\text{column }j\text{ of }B||^2\) is the Cauchy-Schwarz inequality.
Add up both sides over all \(i\) and \(j\) to show that \(||AB||^2_F\leq||A||^2_F\,||B||^2_F\).
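
Spelled out, the summation step looks like this (a sketch following the hint; the double sum over \(i,j\) factors into a sum over rows of \(A\) times a sum over columns of \(B\)):
\[
||AB||_F^2=\sum_{i,j}|(AB)_{ij}|^2\leq\sum_{i,j}||\text{row }i\text{ of }A||^2\,||\text{column }j\text{ of }B||^2=\Big(\sum_i||\text{row }i\text{ of }A||^2\Big)\Big(\sum_j||\text{column }j\text{ of }B||^2\Big)=||A||_F^2\,||B||_F^2.
\]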

Course Info

As Taught In: Spring 2018
Learning Resource Types: Lecture Videos, Problem Sets, Instructor Insights