Project Instructions
The final project is a 4–6 page paper. You can work in a group or alone, and there are two options:
- You can write a literature review on some topic related to the material we covered in class. Since this class focused exclusively on problems with provable guarantees, you should choose a topic where there are provable guarantees. The goal of the writeup is to survey what is known, explain some of the main proof ideas in the papers you choose to read, and identify important open questions. Think of this project as preparing 1–2 lectures for the course: how would you explain some topic beyond what we covered in class to other students? We covered a lot of material in class, but there is still much more out there, and I hope to make your literature reviews available on the course website so that they can be a useful reference for others. Also, you should choose a somewhat focused topic; if your topic is too broad, it will be difficult to get into the details.
- You can do original research. This is the more challenging type of project, but it is also the most open-ended. You should choose some open question either explicitly mentioned in class or connected to any of the topics we covered, and try to give new provable guarantees for some problem, either by giving a new algorithm or by giving an improved analysis of an existing algorithm. As with all research, it is never clear whether you will reach the goal you set out for, so make sure there is some partial progress you can make along the way that you can write up in your final paper. Even if you end up not proving what you set out to prove, you can still write up some preliminary ideas, or at the very least a literature review focused on the open question you worked on.
Suggested Projects
If you have ideas of your own about what you would like to work on, that is great! This is especially good because the purpose of the class was to expose you to some of the provable guarantees that are known for various problems in machine learning, and if you can find topics in your own research area where there are interesting avenues to explore, then you are more likely to use the material from this course in your own research even after the semester ends. That was one of my main goals in teaching this. If you do not have an idea, I am here to help. Here are some suggested topics for writing a literature review. If you are interested in one of these topics, email me to claim it and set up a meeting so that we can talk about it. If you reserve a topic by yourself but are interested in working with another student, I will put a ‘?’ next to your name. And if we run out of topics, I will come up with more suggestions.
- Spectral Clustering
Awasthi, Pranjal, and Or Sheffet. “Improved Spectral-Norm Bounds for Clustering.” (2012).
Kumar, Amit, and Ravindran Kannan. “Clustering with Spectral Norm and the k-Means Algorithm.” (2010).
- Approximation Stability
Balcan, Maria-Florina, Avrim Blum, et al. “Approximate Clustering without the Approximation.” ACM-SIAM Symposium on Discrete Algorithms (2009).
Awasthi, Pranjal, Avrim Blum, et al. “Stability Yields a PTAS for k-Median and k-Means Clustering.” Foundations of Computer Science (2010): 309–18.
- Computational Lower Bounds
Berthet, Quentin, and Philippe Rigollet. “Computational Lower Bounds for Sparse PCA.” (2013).
Zhang, Yuchen, Martin Wainwright, et al. “Lower Bounds on the Performance of Polynomial-time Algorithms for Sparse Linear Regression.” (2014).
- New Models
Bilu, Yonatan, and Nathan Linial. “Are Stable Instances Easy?” (2009).
Makarychev, Konstantin, Yury Makarychev, et al. “Bilu-Linial Stable Instances of Max Cut and Minimum Multiway Cut.” ACM-SIAM Symposium on Discrete Algorithms (2014).
- More General Mixture Models
Dasgupta, Anirban, John Hopcroft, et al. “On Learning Mixtures of Heavy-tailed Distributions.” Foundations of Computer Science (2005): 491–500.
Kannan, Ravindran, Hadi Salmasian, et al. “The Spectral Method for General Mixture Models.” Society for Industrial and Applied Mathematics Journal on Computing 38, no. 3 (2005): 444–57.
- Graphical Models
Bresler, Guy, Elchanan Mossel, et al. “Reconstruction of Markov Random Fields from Samples: Some Easy Observations and Algorithms.” (2010).
Bresler, Guy. “Efficiently Learning Ising Models on Arbitrary Graphs.” (2014).
- Stochastic Block Model
Mossel, Elchanan, Joe Neeman, et al. “Stochastic Block Models and Reconstruction.” (2012).
Hajek, Bruce, Yihong Wu, et al. “Achieving Exact Cluster Recovery Threshold via Semidefinite Programming.” (2014).
- Approximate Gradient Descent
Netrapalli, Praneeth, Prateek Jain, et al. “Phase Retrieval using Alternating Minimization.” (2015).
Balakrishnan, Sivaraman, Martin Wainwright, et al. “Statistical Guarantees for the EM Algorithm: From Population to Sample-based Analysis.” (2014).
- Planted Sparse Vectors
Demanet, Laurent, and Paul Hand. “Scaling Law for Recovering the Sparsest Element in a Subspace.” (2014).
Barak, Boaz, Jonathan Kelner, et al. “Rounding Sum-of-Squares Relaxations.” (2013).
- Runtime / Statistical Tradeoffs
Shalev-Shwartz, Shai, and Nathan Srebro. “SVM Optimization: Inverse Dependence on Training Set Size.” International Conference on Machine Learning (2008): 928–35.
Daniely, Amit, Nati Linial, et al. “More Data Speeds up Training Time in Learning Halfspaces over Sparse Vectors.” (2013).
- Non-Plug-in Estimators
Wigderson, Avi, and Amir Yehudayoff. “Population Recovery and Partial Identification.” Foundations of Computer Science (2012): 390–99.
Moitra, Ankur, and Michael Saks. “A Polynomial Time Algorithm for Lossy Population Recovery.” (2013).