18.S096 | January IAP 2023 | Undergraduate

Matrix Calculus for Machine Learning and Beyond

Lecture 4 Part 2: Nonlinear Root Finding, Optimization, and Adjoint Gradient Methods

Description: Nonlinear root finding via Newton’s method and optimization via gradient descent. “Adjoint” methods (reverse-mode differentiation/backpropagation) let us compute gradients efficiently for large-scale engineering optimization.
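The three techniques named in the description can be sketched in a few lines of Python. This is an illustrative sketch, not from the course materials: the helper names, test functions, and step sizes are all assumptions chosen for the demo.

```python
import numpy as np

def newton_root(f, fprime, x0, tol=1e-10, maxiter=50):
    """Newton's method for f(x) = 0: iterate x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(maxiter):
        step = f(x) / fprime(x)
        x = x - step
        if abs(step) < tol:
            break
    return x

def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Minimize a function by stepping against its gradient: x <- x - lr * grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Newton: the root of f(x) = x^2 - 2 near x0 = 1 is sqrt(2).
root = newton_root(lambda x: x**2 - 2, lambda x: 2 * x, 1.0)

# Gradient descent: minimize f(v) = |v - (1, -2)|^2, whose gradient is 2*(v - (1, -2)).
xmin = gradient_descent(lambda v: 2 * (v - np.array([1.0, -2.0])), np.zeros(2))

# Reverse-mode (adjoint/backpropagation) sketch for f(x) = sin(x)^2:
# run the computation forward, then propagate sensitivities backward,
# multiplying by each step's local derivative in reverse order.
x = 0.7
a = np.sin(x)            # forward pass, keeping the intermediate value a
y = a**2
ybar = 1.0               # backward pass seed: dy/dy = 1
abar = ybar * 2 * a      # dy/da
xbar = abar * np.cos(x)  # dy/dx = 2 sin(x) cos(x) = sin(2x)
```

The reverse sweep costs about the same as one forward evaluation regardless of how many inputs there are, which is why adjoint methods scale to large engineering optimization problems.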

Instructors: Alan Edelman, Steven G. Johnson
