Video Lectures

Lecture 21: Minimizing a Function Step by Step

Description

In this lecture, Professor Strang discusses optimization, the fundamental algorithm behind deep learning. Later in the lecture, he reviews the structure of convolutional neural networks (CNNs) used in analyzing visual imagery.

Summary

Three terms of a Taylor series of \(F(x)\): many variables \(x\)
Downhill direction decided by first partial derivatives of \(F\) at \(x\)
Newton’s method uses higher derivatives (Hessian at higher cost).
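In symbols, the three terms of the expansion are \(F(x+\Delta x)\approx F(x)+\nabla F(x)^{\mathsf T}\Delta x+\tfrac12\,\Delta x^{\mathsf T}H\,\Delta x\), where the Hessian \(H\) holds the second partial derivatives. The sketch below is a minimal illustration, not code from the lecture: it compares gradient descent, which steps along \(-\nabla F\), with one Newton step, which solves \(H\,\Delta x=-\nabla F\), on a small quadratic. The matrix \(S\), vector \(b\), step size, and iteration count are made-up example values.

```python
# A minimal sketch: gradient descent vs. Newton's method on the
# quadratic F(x) = (1/2) x^T S x - b^T x, with gradient S x - b
# and constant Hessian S. Example values below are illustrative.
import numpy as np

S = np.array([[2.0, 0.0],
              [0.0, 10.0]])    # positive definite, so F is convex
b = np.array([2.0, 10.0])      # exact minimizer is x* = S^{-1} b = (1, 1)

def grad(x):
    return S @ x - b           # first partial derivatives of F at x

# Gradient descent: repeated small steps in the downhill direction
x = np.zeros(2)
for _ in range(50):
    x = x - 0.05 * grad(x)     # step size 0.05 is an assumed example value
print("gradient descent:", x)  # slowly approaches (1, 1)

# Newton's method: use the Hessian, at the higher cost of solving a system
x = np.zeros(2)
x = x - np.linalg.solve(S, grad(x))   # solve H (dx) = -grad F
print("Newton:", x)            # lands on (1, 1) in one step for a quadratic
```

Because the Hessian of a quadratic is exact, one Newton step reaches the minimizer; gradient descent needs many steps when the eigenvalues of \(S\) are spread apart.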

Related sections in textbook: VI.1, VI.4

Instructor: Prof. Gilbert Strang

Problems for Lecture 21
From textbook Sections VI.1 and VI.4

1. When is the union of two circular discs a convex set? Or two squares?

5. Suppose \(K\) is convex and \(F(x)=1\) for \(x\) in \(K\) and \(F(x)=0\) for \(x\) not in \(K\). Is \(F\) a convex function? What if the 0 and 1 are reversed?
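For reference when working these problems, the standard definitions (consistent with Sections VI.1 and VI.4 of the textbook) are:

\[
K \text{ is convex} \iff \lambda x + (1-\lambda)y \in K \quad \text{for all } x, y \in K,\ 0 \le \lambda \le 1,
\]
\[
F \text{ is convex} \iff F(\lambda x + (1-\lambda)y) \le \lambda F(x) + (1-\lambda)F(y) \quad \text{for all } x, y,\ 0 \le \lambda \le 1.
\]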

Course Info

As Taught In: Spring 2018
Learning Resource Types: Lecture Videos, Problem Sets, Instructor Insights