18.S096 | January IAP 2023 | Undergraduate

Matrix Calculus for Machine Learning and Beyond

Lecture 4 Part 2: Nonlinear Root Finding, Optimization, and Adjoint Gradient Methods

Description: Nonlinear root finding by Newton’s method and optimization by gradient descent. “Adjoint” methods (reverse-mode differentiation, a.k.a. backpropagation) let us compute gradients efficiently for large-scale engineering optimization.
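As a rough illustration of the three techniques named in the description (this is a sketch for orientation, not code from the lecture), the following shows Newton's method on a scalar equation, plain gradient descent on a scalar objective, and a hand-written reverse-mode ("adjoint") derivative obtained by accumulating chain-rule factors backward from the output. The example functions, step sizes, and tolerances are illustrative choices, not from the course.

```python
import math

def newton(f, fprime, x0, tol=1e-12, maxiter=50):
    """Find a root of f by Newton's method: x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(maxiter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Minimize a smooth function by stepping against its gradient."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Root of f(x) = x^2 - 2 near x = 1 (i.e., sqrt(2)).
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)

# Minimizer of g(x) = (x - 3)^2, whose gradient is 2(x - 3).
xmin = gradient_descent(lambda x: 2 * (x - 3), 0.0)

# Tiny reverse-mode ("adjoint") example: d/dx sin(x^2).
# Forward pass stores intermediates; the backward pass multiplies
# chain-rule factors from the output back to the input, as
# backpropagation does.
x = 0.5
a = x * x                  # forward: a = x^2
y = math.sin(a)            # forward: y = sin(a)
ybar = 1.0                 # seed adjoint: dy/dy
abar = ybar * math.cos(a)  # dy/da
xbar = abar * 2 * x        # dy/dx = cos(x^2) * 2x
```

The reverse-mode pass costs about the same as one extra forward pass regardless of how many inputs there are, which is why adjoint methods scale to optimization problems with many parameters.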

Instructors: Alan Edelman, Steven G. Johnson
