18.409 | Spring 2015 | Graduate

Algorithmic Aspects of Machine Learning

Discussion

Nonnegative Matrix Factorization

Discussion: When does well-posedness lead to better algorithms?

Balcan, M., A. Blum, et al. “Clustering under Approximation Stability.” Journal of the ACM (2013).
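To ground the discussion, here is a minimal numpy sketch of the classical multiplicative-update heuristic for NMF (in the style of Lee and Seung); it is an illustrative baseline rather than the separability-based algorithms from the course, and the function name and parameter choices are assumptions made for this example.

```python
import numpy as np

def nmf_multiplicative(M, r, iters=200, eps=1e-9, seed=0):
    """Approximate a nonnegative matrix M by A @ W with A, W >= 0 and inner dimension r."""
    rng = np.random.default_rng(seed)
    n, m = M.shape
    A = rng.random((n, r))
    W = rng.random((r, m))
    for _ in range(iters):
        # Multiplicative updates keep both factors entrywise nonnegative.
        W *= (A.T @ M) / (A.T @ A @ W + eps)
        A *= (M @ W.T) / (A @ W @ W.T + eps)
    return A, W

# Usage: factor a synthetic rank-3 nonnegative matrix and report relative error.
rng = np.random.default_rng(1)
M = rng.random((20, 3)) @ rng.random((3, 30))
A, W = nmf_multiplicative(M, r=3)
print(np.linalg.norm(M - A @ W) / np.linalg.norm(M))
```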

Tensor Decompositions

Discussion: When do algorithms rely (too much) on a distributional model?

Feige, U., and J. Kilian. “Heuristics for Semirandom Graph Problems.” Journal of Computer and System Sciences 63, no. 4 (2001): 639–71.
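As a concrete reference point, the sketch below implements Jennrich-style simultaneous diagonalization for a third-order tensor whose factor matrices have linearly independent columns; the helper name and the synthetic example are assumptions for illustration, not code from the readings.

```python
import numpy as np

def jennrich_first_factor(T, r, seed=0):
    """Recover the first factor of T = sum_i a_i (x) b_i (x) c_i, up to permutation and scaling."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(T.shape[2])
    y = rng.standard_normal(T.shape[2])
    Tx = np.tensordot(T, x, axes=([2], [0]))   # equals A diag(C^T x) B^T
    Ty = np.tensordot(T, y, axes=([2], [0]))   # equals A diag(C^T y) B^T
    # Eigenvectors of Tx @ pinv(Ty) with nonzero eigenvalue are the columns of A.
    M = Tx @ np.linalg.pinv(Ty)
    eigvals, eigvecs = np.linalg.eig(M)
    top = np.argsort(-np.abs(eigvals))[:r]
    return np.real(eigvecs[:, top])            # imaginary parts are numerical noise here

# Usage: build a random rank-3 tensor and recover its first factor.
rng = np.random.default_rng(1)
A, B, C = rng.standard_normal((5, 3)), rng.standard_normal((6, 3)), rng.standard_normal((7, 3))
T = np.einsum('ir,jr,kr->ijk', A, B, C)
A_hat = jennrich_first_factor(T, r=3)   # columns match those of A up to order and scale
```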

Sparse Coding

Discussion: When does belief propagation (provably) work?

Geman, S., and D. Geman. “Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images.” IEEE Transactions on Pattern Analysis and Machine Intelligence (1984).
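In the spirit of the Geman and Geman reading (stochastic relaxation rather than belief propagation), here is a toy Gibbs sampler that restores a ±1 image under an Ising prior; the parameter values and function name are illustrative assumptions.

```python
import numpy as np

def gibbs_denoise(noisy, beta=2.0, eta=1.5, sweeps=30, seed=0):
    """Denoise a +/-1 image by Gibbs sampling from an Ising prior coupled to the observation."""
    rng = np.random.default_rng(seed)
    x = noisy.copy()
    n, m = x.shape
    for _ in range(sweeps):
        for i in range(n):
            for j in range(m):
                # Sum over the 4-neighborhood; out-of-range neighbors contribute 0.
                nb = sum(x[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                         if 0 <= a < n and 0 <= b < m)
                # Conditional log-odds that pixel (i, j) equals +1 given neighbors and data.
                logodds = 2 * (beta * nb + eta * noisy[i, j])
                x[i, j] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-logodds)) else -1
    return x

# Usage: flip 15% of the pixels of a square image and restore it.
rng = np.random.default_rng(1)
clean = np.ones((32, 32), dtype=int)
clean[8:24, 8:24] = -1
noisy = clean * np.where(rng.random(clean.shape) < 0.15, -1, 1)
print((gibbs_denoise(noisy) != clean).mean())
```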

Learning Mixture Models

Discussion: Is nature an adversary? And if not, how can we model and exploit that?

Bhaskara, A., M. Charikar, et al. “Smoothed Analysis of Tensor Decompositions.” Symposium on Theory of Computing (2014).
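For contrast with the moment-based and smoothed-analysis approaches in the reading, the sketch below fits a one-dimensional mixture of two Gaussians with plain EM; it is a standard baseline, and the initialization and iteration count are illustrative assumptions.

```python
import numpy as np

def em_two_gaussians(x, iters=100, seed=0):
    """Fit mixing weight, means, and variances of a two-component 1-D Gaussian mixture by EM."""
    rng = np.random.default_rng(seed)
    w = 0.5
    mu = rng.choice(x, size=2, replace=False).astype(float)
    var = np.array([x.var(), x.var()])
    for _ in range(iters):
        # E-step: responsibility of component 0 for each sample (shared constants cancel).
        p0 = w * np.exp(-(x - mu[0]) ** 2 / (2 * var[0])) / np.sqrt(var[0])
        p1 = (1 - w) * np.exp(-(x - mu[1]) ** 2 / (2 * var[1])) / np.sqrt(var[1])
        r = p0 / (p0 + p1)
        # M-step: re-estimate the parameters from the soft assignments.
        w = r.mean()
        mu = np.array([(r * x).sum() / r.sum(), ((1 - r) * x).sum() / (1 - r).sum()])
        var = np.array([(r * (x - mu[0]) ** 2).sum() / r.sum(),
                        ((1 - r) * (x - mu[1]) ** 2).sum() / (1 - r).sum()])
    return w, mu, var

# Usage: recover the parameters of a well-separated synthetic mixture.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 0.5, 500)])
print(em_two_gaussians(x))
```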

Linear Inverse Problems

Discussion: Do we have enough average-case assumptions?

Berthet, Q., and P. Rigollet. “Computational Lower Bounds for Sparse PCA.” Conference on Learning Theory (2013).

Chandrasekaran, V., and M. Jordan. “Computational and Statistical Tradeoffs via Convex Relaxation.” Proceedings of the National Academy of Sciences of the United States of America 110, no. 13 (2013): E1181–90.
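As a concrete instance of the convex-relaxation viewpoint in the Chandrasekaran and Jordan reading, here is a minimal iterative soft-thresholding (ISTA) sketch for the LASSO formulation of sparse recovery; the problem sizes and regularization level are illustrative assumptions.

```python
import numpy as np

def ista(A, y, lam=0.1, iters=500):
    """Minimize 0.5 * ||Ax - y||^2 + lam * ||x||_1 by iterative soft thresholding."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the smooth part's gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - y)               # gradient of 0.5 * ||Ax - y||^2
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # proximal step for the l1 term
    return x

# Usage: recover a 5-sparse vector from 60 random linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 200)) / np.sqrt(60)
x_true = np.zeros(200)
x_true[rng.choice(200, 5, replace=False)] = rng.standard_normal(5)
y = A @ x_true
print(np.linalg.norm(ista(A, y, lam=0.02) - x_true))
```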
